FRDB Archives

Freethought & Rationalism Archive

The archives are read only.


Old 01-27-2002, 08:10 AM   #31
Senior Member
 
Join Date: Jun 2000
Location: Tucson, AZ USA
Posts: 966
Post

Quote:
Originally posted by Filip Sandor:
Anyway, we have both finished our demonstrations. I have shown you the function of a printer and you have shown me the function of your brain... except there is one little problem:

Where is your end product (mental image)?
In my brain, of course. I don't see anything particularly difficult with that.

Are you asking me, "How are thoughts stored in the brain?" If so, then my only answer to you is that I have no idea. Presumably they are stored in the form of neuronal connections or perhaps certain neuron firing patterns, but I really don't know enough to say. How is this a problem?

Clearly there are conscious and unconscious parts of the mind. Freud even labelled certain areas such as memory "pre-conscious" since while they weren't kept in the conscious part all the time, they could be focused on and "brought forth" into the conscious area.

So I would assume that when you construct a mental image, you take bits and pieces of things from your memory, create a mental image of them in the conscious part of your mind, and when you are no longer focusing on it, the whole thing slips right back into memory.

Perhaps you can tell me how this is a problem for either materialism or a functional view of the mind?

Edited for spelling and to add:

Now, I would say that a "mental image" is simply a set of visual symbols under consideration of the conscious mind. There's no actual "picture" there. Whatever format your brain uses to encode and store visual information would be the same format your brain would use when conjuring a mental image.
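(As a rough illustration only, and purely a toy example of my own rather than anything about actual brains: in a computer, an "image" is just numbers in a particular encoding, and nothing picture-like sits in memory until something interprets those numbers according to the format.)

Code:
# Toy example: a 3x3 grayscale "image", 0 = black, 255 = white.
image = [
    [0, 255, 0],
    [255, 255, 255],
    [0, 255, 0],
]

# Stored or transmitted, it is only a flat run of bytes.
encoded = bytes(value for row in image for value in row)
print(encoded)  # b'\x00\xff\x00\xff\xff\xff\x00\xff\x00', no "picture" here

# Only an interpreter that knows the format recovers the picture-like structure.
for row in image:
    print("".join("#" if v else "." for v in row))

The stored bytes and the displayed picture are the same information in two guises; the "image" as such only shows up when the right decoder runs over the data.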

Daniel "Theophage" Clark

[ January 27, 2002: Message edited by: Theophage ]
Theophage is offline  
Old 01-27-2002, 08:33 AM   #32
Senior Member
 
Join Date: Jun 2000
Location: Tucson, AZ USA
Posts: 966
Post

Hello again Kharakov,

In your first posts here, I was ready to dismiss you as a crank, but clearly I was mistaken. I think this post brings up some interesting ideas, and I will be happy to respond to them.

Quoted material by Kharakov will be in bold:

I disagree with your statement that there is any empirical evidence that toy trucks do not possess consciousness.


I will try to show how I believe there is such empirical evidence. But since I don't want to misunderstand you further, would you also say that, even though there may be no empirical evidence of such, there is some other reason for believing that toy trucks do not possess consciousness? In other words, do you consider the probability that toy trucks are conscious to be equal to the probability that they are not?

I don't think a conscious object has to communicate with or display behaviour detectable by objects besides itself in order to be conscious.


I agree with you here: consciousness does not have to demonstrate itself to exist, but that is irrelevant. In order for me to have reason to believe that something is conscious, I must have some sort of demonstration. Given no demonstration, the reasonable conclusion is that a thing is not conscious. Do you agree or disagree?

Now, you may ask, why is that? Simply because we humans recognize what we consider "consciousness" in others. We recognize that others are conscious by their behaviors. We also see things which we don't consider conscious. When deciding which group to place a new thing in, we compare the behavior of the new thing to the behaviors of the conscious and non-conscious groups.

"Does the thing behave like a conscious thing, or a non-conscious thing?" we ask ourselves. Since a toy truck behaves like a non-conscious thing, and not like a conscious thing, our reason tells us that it should be classified in the non-conscious group. It is this examination of behavior that is the empirical evidence I was talking about. Do you deny that observation of behvior is empirical evidence?

Now I understand your objection that our observations are limited and may be wrong. The toy truck may indeed be conscious, and simply not be displaying anything that we would consider conscious behavior. Sure, that could be true, but that is not the point. Reason, unlike deductive logic, does not tell us what must be true or must be false. It only tells us what is more likely true or more likely false given the evidence.

Since there are two pieces of empirical evidence that toy trucks are not conscious (1. they display the behaviors of other non-conscious things, and 2. they display no behaviors of conscious things) but zero empirical evidence that they are conscious, the conclusion due to reason is clearly that they are not conscious.

I hope I have explained this simply enough and yet still retained whatever it is you were looking for in my answer.

It seems to me (and I may be misreading you) that you are saying that since we cannot be 100% sure that the toy is not conscious, we cannot come to any conclusion about it. If that is what you mean, I find that just plain silly.

You also seem to be saying that while empirical evidence might be there (as in the case of zombies, etc.), it may be wrong, and is therefore invalid or useless as evidence. I find this position silly also, and would say that "evidence" is certainly different from "proof", so of course evidence can be wrong or lead to wrong conclusions. But that has nothing to do with whether or not a conclusion is justified by evidence.

Hopefully you can clear these matters up next time.

Daniel "Theophage" Clark
Theophage is offline  
Old 01-27-2002, 01:54 PM   #33
Veteran Member
 
Join Date: Oct 2001
Location: Canada
Posts: 3,751
Post

Filip, your argument relies entirely on the (so far as I've seen) completely unexplained notion of a "characteristic difference". It sounds very much like you simply mean "essential property", which would (i) presuppose a pellucid notion of essential properties, and (ii) mean that your argument is just Descartes' argument.

What are characteristic properties, and how are they identified? If they are identified a posteriori, I can't see any reason to conclude a priori that (eg) having certain physico-causal relations to certain neural inputs couldn't turn out to be a characteristic property of the visual experience of an edge qua brain process. After all, grasping either one of the concepts "water" or "H2O" does not tell you that the other concept even exists, never mind that it picks out the same stuff; it would be very odd to make armchair claims about what water could and could not be, solely on the basis of reviewing the concept of water. That two concepts have different contents is no proof, a priori, that they do not have the same extensions.

For what it's worth, this is also exactly why UE's premise (5) is risible, in his much-self-vaunted argument against materialism. It assumes that co-reference should be derivable from the contents of a concept. As "morning star" vs "evening star" shows, however, even if UE's (5) were true it would show nothing about materialism. Materialism does not assert that the physicality of a class of objects should be logically entailed by every concept characterizing the class.
Clutch is offline  
Old 01-27-2002, 02:10 PM   #34
Veteran Member
 
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
Post

It may be because we're coming at this from different sides Filip, but I don't see what I haven't addressed. That materialism implies one should be able to see a mental image on an MRI is a figment of your imagination. It is your problem, not mine.
tronvillain is offline  
Old 01-27-2002, 02:22 PM   #35
Senior Member
 
Join Date: Feb 2001
Location: Toronto
Posts: 808
Post

I just want to inject a quick note that may be of use here as an analogy.

The mind as a function of the brain needs to be understood properly in order to understand materialism. The printer example came up to illustrate that the brain does not produce something that is 'showable'.

I think a better analogy would be the CPU of a computer with the monitor unplugged. There is no way to know what the chip is 'thinking' by looking at an electron-path scan of the CPU (unless you know everything about the chip, and even then it's a monumental task). The software is still purely a function of the chip (much like the mind), but there is no way to know what the chip is thinking.

This is a good way to understand the materialist view of mind, since software is also an emergent function of complex hardware. The primary difference is that we write our own mind as we grow up.
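To make the analogy a bit more concrete, here is a rough sketch in Python (a toy example of my own, not a claim about any real CPU or brain): the high-level "state" a program works with is entirely present in memory, yet a raw dump of the bytes that hold it looks nothing like that state unless you already know the encoding rules.

Code:
import pickle

# The high-level "state" this program is working with.
thought = {"subject": "toy truck", "colour": "red", "wheels": 4}

# Serialize it to raw bytes: roughly the view you get "peeking at the chip"
# without knowing the encoding rules.
raw = pickle.dumps(thought)
print(raw[:40])           # opaque byte soup
print(len(raw), "bytes")

# With the decoding rules in hand, the very same bytes yield the "state" back.
print(pickle.loads(raw))

Nothing in the byte dump "looks like" a toy truck, yet the information is all there; recovering it is just a matter of knowing the format, which is the monumental task in the unplugged-monitor case.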
Christopher Lord is offline  
Old 01-27-2002, 08:27 PM   #36
Senior Member
 
Join Date: Jun 2000
Location: Tucson, AZ USA
Posts: 966
Post

Unfortunately, we didn't design the hardware, so how it works is still a mystery to us. Thank you for a much better analogy.
Theophage is offline  
Old 01-27-2002, 08:41 PM   #37
Regular Member
 
Join Date: Mar 2001
Location: Vancouver, British Columbia, Canada
Posts: 181
Post

Quote:
Originally posted by Adrian Selby:
By words existing mentally, in one's mind, do you mean they exist as ontologically distinct from neural nets that recognise symbols and the associations and applications of those symbols.
Apparently it would seem so, and so my answer is yes.

However, I have trouble imagining that neural nets can 'recognize' anything at all; just as I would have trouble imagining that a vehicle is aware of the parts that it 'utilizes'.. but that's a whole other debate.

Quote:
Only the recognition of words might just be what happens when a human brain that has learned to recognise those arrangements of shapes in the visual field is able to respond to them in a meaningful way.....
Again, I have a difficult time imagining that our brains, or any part of our brains, are capable of 'learning' anything, let alone 'responding in a meaningful way'. I think this is as inaccurate as assuming that pressing my finger into play-doh 'teaches' my fingerprint to the play-doh, and that the play-doh 'learns' this and 'responds in a meaningful way' by molding itself accordingly.

Maybe I am under the illusion that I possess free will.. but again, that's a whole other debate too.

Without getting sidetracked, though, by the details of what may possess free will or exactly what perceives things, I think it is quite apparent that we are lacking physical evidence of the existence of mental phenomena... but does this mean that mental phenomena do not exist?

Is this 'conversation' not real? How can we be aware of something that doesn't exist?

Quote:
On the subject of toy trucks, I'd argue that for something to be conscious it must have a certain complexity with regard to a specific kind of 'matter', namely, like neurons, matter that is akin to logic gates, whether it be circuitry or neurons, to give two examples.
What do you mean when you say that not all matter is akin to logic gates? More specifically, how would you define a 'logic gate' in physical terms?

Quote:
Trucks do not have this kind of matter, e.g. neural nets, and do not have sufficient complexity in the arrangement of this matter. Quite how to draw the lines is a problem, but it is obvious that a duck can be said to be conscious and a toy truck not, if only because on the above definition, the duck (barely) cuts the mustard.
According to the Funk & Wagnalls encyclopedia (1996 edition), there is currently (as of 1996) no widely accepted or agreed-upon definition amongst scholars of what it is to be conscious (a.k.a. what it is to be aware).

I would conclude that this is still true today.

It's very intriguing, however, at least to me, that even though it is so difficult and maybe even impossible to give a verbal definition of awareness, it is probably one of the things we all know most intimately!

This really goes to show just how limited our language is.

Quote:
Self consciousness, being a higher order function again, would require a far greater degree of complexity. A criterion of complexity in the arrangement of the matter constituting the central nervous system seems to me on the surface to offer some kind of foundation for sorting out what can be and what can't be conscious.
Are you referring to what you described earlier as 'logic gates'?
Filip Sandor is offline  
Old 01-27-2002, 09:10 PM   #38
Regular Member
 
Join Date: Mar 2001
Location: Vancouver, British Columbia, Canada
Posts: 181
Post

Quote:
Originally posted by Theophage:
In my brain, of course. I don't see anything particularly difficult with that.
The difficulty with the assumption that your thoughts are in your brain is that there is nothing in your brain that bears any qualitative resemblance to the mental 'objects' in your mind.

As far as anyone who is looking in your brain can tell, you don't have a mind at all.

This seems to suggest, at least to me, a major flaw in materialism: the view that everything that actually exists is material, or physical.

[ Note: Many philosophers and scientists now use the terms `material' and `physical' interchangeably. ]

Quote:
Clearly there are conscious and unconscious parts of the mind.
Whenever I use the term mind, I am referring to the sum of all the mental phenomena perceived by an individual (i.e. thought, feeling, will, desire, etc.).

I personally don't believe that mental phenomena (i.e. thoughts, feelings, wills, desires, etc.) are conscious (or aware) of anything.

Quote:
I would assume that when you construct a mental image, you take bits and pieces of things from your memory..
The one big problem here, which I am trying desperately to point out to everyone, is that these 'bits and pieces' do not exhibit any physical evidence of their existence.

Materialism just doesn't seem to make any sense.

[ January 27, 2002: Message edited by: Filip Sandor ]
Filip Sandor is offline  
Old 01-27-2002, 09:39 PM   #39
Regular Member
 
Join Date: Mar 2001
Location: Vancouver, British Columbia, Canada
Posts: 181
Post

Quote:
Originally posted by Clutch:
Filip, your argument relies entirely on the (so far as I've seen) completely unexplained notion of a "characteristic difference". It sounds very much like you simply mean "essential property", which would (i) presuppose a pellucid notion of essential properties, and (ii) mean that your argument is just Descartes' argument.
Sorry if I have confused you.

My analogy is actually quite simple -- it only appears complex when you apply it to mental phenomena. In essence, you can label just about anything as a 'quality' or 'property' of something, which tends to get quite confusing.

To put it more simply, I am exploiting the fact that there is a major difference in the 'qualities' or 'properties' of any mental phenomena and the 'qualities' or 'properties' of the physical phenomena that it corresponds to in the brain, whatever that may be.

Essentially, we are faced with a consequential dilemma: if the mind is really a physical thing and it is in the brain, then either the neuroscientist who is looking at your brain is having a mass hallucination or you are having a mass hallucination.

The only other alternative is that neither of you is hallucinating and that both of you are actually perceiving different phenomena.
Filip Sandor is offline  
Old 01-27-2002, 09:57 PM   #40
Regular Member
 
Join Date: Mar 2001
Location: Vancouver, British Columbia, Canada
Posts: 181
Post

Quote:
Originally posted by tronvillain:
It may be because we're coming at this from different sides Filip, but I don't see what I haven't addressed.
You have not addressed the lack of physical evidence of the mind in the brain.

Quote:
That materialism implies one should be able to see a mental image on an MRI is a figment of your imagination. It is your problem, not mine.
In case you didn't realize, materialism implies that mental phenomena are physical and that they exist in the brain.

P.S. No, I am not suggesting that we should see mental images on an MRI scan -- I am not a materialist.
Filip Sandor is offline  
 
