Freethought & Rationalism Archive
The archives are read only.
01-27-2002, 08:10 AM | #31 |
Senior Member
Join Date: Jun 2000
Location: Tucson, AZ USA
Posts: 966
Quote:
Are you asking me, "How are thoughts stored in the brain?" If so, then my only answer to you is that I have no idea. Presumably they are stored in the form of neuronal connections or perhaps certain neuron firing patterns, but I really don't know enough to say. How is this a problem?

Clearly there are conscious and unconscious parts of the mind. Freud even labelled certain areas such as memory "pre-conscious," since while they weren't kept in the conscious part all the time, they could be focused on and "brought forth" into the conscious area. So I would assume that when you construct a mental image, you take bits and pieces of things from your memory, create a mental image of them in the conscious part of your mind, and when you are no longer focusing on it, the whole thing slips right back into memory. Perhaps you can tell me how this is a problem for either materialism, or for a functional view of the mind?

Edited for spelling and to add: Now, I would say that a "mental image" is simply a set of visual symbols under consideration by the conscious mind. There's no actual "picture" there. Whatever format your brain uses to encode and store visual information would be the same format your brain would use when conjuring a mental image.

Daniel "Theophage" Clark

[ January 27, 2002: Message edited by: Theophage ]
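(An illustrative aside, not part of Theophage's post: the claim that a stored "image" is just an encoding, with nothing picture-like about it until something reads it back out, can be sketched in a few lines of Python. The variable names, the toy bitmap, and the recall function are all invented for the example.)

    # A minimal sketch, assuming a made-up storage format: the "image" in
    # storage is only a flat list of numbers in whatever format the system uses.
    stored = [
        0, 0, 1, 1,
        0, 1, 1, 0,
        1, 1, 0, 0,
    ]

    def recall(encoding, width=4):
        # "Conjuring" the image is just re-reading the same encoding;
        # no separate picture exists anywhere apart from it.
        rows = [encoding[i:i + width] for i in range(0, len(encoding), width)]
        return "\n".join("".join("#" if px else "." for px in row) for row in rows)

    print(recall(stored))

The point of the sketch is only that the same encoding serves both for storage and for "bringing the image forth"; nothing picture-like sits anywhere in between.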
01-27-2002, 08:33 AM | #32 |
Senior Member
Join Date: Jun 2000
Location: Tucson, AZ USA
Posts: 966
Hello again Kharakov,
In your first posts here, I was ready to dismiss you as a crank, but clearly I was mistaken. I think this post brings up some interesting ideas, and I will be happy to respond to them. Quoted material by Kharakov will be in bold:

I disagree with your statement that there is any empirical evidence that toy trucks do not possess consciousness.

I will try to show how I believe there is such empirical evidence. But since I don't want to misunderstand you further, would you also say that even though there may be no empirical evidence of such, there is some other reason for believing that toy trucks do not possess consciousness? In other words, do you consider the probability that toy trucks are conscious to be equal to the probability that they are not?

I don't think a conscious object has to communicate with or display behaviour detectable by objects besides itself in order to be conscious.

I agree with you here: consciousness does not have to demonstrate itself in order to exist, but that is irrelevant. In order for me to have reason to believe that something is conscious, I must have some sort of demonstration. Given no demonstration, the reasonable conclusion is that a thing is not conscious. Do you agree or disagree?

Now, you may ask, why is that? Simply because we humans recognize what we consider "consciousness" in others. We recognize that others are conscious by their behaviors. We also see things which we don't consider conscious. When deciding which group to place a new thing in, we compare the behavior of the new thing to the behaviors of the conscious and non-conscious groups. "Does the thing behave like a conscious thing, or a non-conscious thing?" we ask ourselves. Since a toy truck behaves like a non-conscious thing, and not like a conscious thing, our reason tells us that it should be classified in the non-conscious group. It is this examination of behavior that is the empirical evidence I was talking about. Do you deny that observation of behavior is empirical evidence?

Now I understand your objection that our observations are limited and may be wrong. The toy truck may indeed be conscious, and simply not be displaying anything that we would consider conscious behavior. Sure, that could be true, but that is not the point. Reason, unlike deductive logic, does not tell us what must be true or must be false. It only tells us what is more likely true or more likely false given the evidence. Since there are two pieces of empirical evidence that toy trucks are not conscious (1. they display the behaviors of other non-conscious things, and 2. they display no behaviors of conscious things) but zero empirical evidence that they are conscious, the conclusion due to reason is clearly that they are not conscious.

I hope I have explained this simply enough and yet still retained whatever it is you were looking for in my answer. It seems to me (and I may be misreading you) that you are saying that since we cannot be 100% sure that the toy is not conscious, we cannot come to any conclusion about it. If that is what you mean, I find that just plain silly. You also seem to be saying that while empirical evidence might be there (as in the case of zombies, etc.), it may be wrong, and is therefore invalid or useless as evidence. I find this position silly also, and would say that "evidence" is certainly different from "proof," so of course evidence can be wrong or lead to wrong conclusions. But that has nothing to do with whether or not a conclusion is justified by evidence.

Hopefully you can clear these matters up next time.

Daniel "Theophage" Clark
01-27-2002, 01:54 PM | #33 |
Veteran Member
Join Date: Oct 2001
Location: Canada
Posts: 3,751
Filip, your argument relies entirely on the (so far as I've seen) completely unexplained notion of a "characteristic difference". It sounds very much like you simply mean "essential property", which would (i) presuppose a pellucid notion of essential properties, and (ii) mean that your argument is just Descartes' argument.
What are characteristic properties, and how are they identified? If they are identified a posteriori, I can't see any reason to conclude a priori that (e.g.) having certain physico-causal relations to certain neural inputs couldn't turn out to be a characteristic property of the visual experience of an edge qua brain process. After all, grasping either one of the concepts "water" or "H2O" does not tell you that the other concept even exists, never mind that it picks out the same stuff; it would be very odd to make armchair claims about what water could and could not be, solely on the basis of reviewing the concept of water. That two concepts have different contents is no proof, a priori, that they do not have the same extensions.

For what it's worth, this is also exactly why UE's premise (5) is risible, in his much-self-vaunted argument against materialism. It assumes that co-reference should be derivable from the contents of a concept. As "morning star" vs. "evening star" shows, however, even if UE's (5) were true it would show nothing about materialism. Materialism does not assert that the physicality of a class of objects should be logically entailed by every concept characterizing the class.
01-27-2002, 02:10 PM | #34 |
Veteran Member
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658
It may be because we're coming at this from different sides, Filip, but I don't see what I haven't addressed. That materialism implies one should be able to see a mental image on an MRI is a figment of your imagination. It is your problem, not mine.
01-27-2002, 02:22 PM | #35 |
Senior Member
Join Date: Feb 2001
Location: Toronto
Posts: 808
I just want to inject a quick note that may be of use here as an analogy.
The mind as a function of the brain needs to be understood properly in order to understand materialism. The printer example came up earlier to illustrate that the brain does not produce something that is 'showable'. I think a better analogy would be the CPU of a computer with the monitor unplugged. There is no way to know what the chip is 'thinking' by looking at an electron-path scan of the CPU (unless you know everything about the chip, and even then it's a monumental task). The software is still purely a function of the chip (much like the mind), but there is no way to know what the chip is thinking.

This is a good way to understand the materialist view of mind, since software is also an emergent function of complex hardware. The primary difference is that we write our own mind as we grow up.
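(An illustrative aside, not part of the original post: the unplugged-monitor point can be sketched in a few lines of Python. A raw dump of a program's state admits several equally valid readings, and nothing in the bytes themselves says which reading the running software means. The specific byte value and the readings taken from it are invented for the example; struct is Python's standard module for packing and unpacking binary data.)

    import struct

    # Four bytes of "hardware" state, as a scan-style dump might show them.
    # The value here is arbitrary and chosen only for the illustration.
    state = struct.pack("<I", 1936287828)

    # The dump alone admits several equally valid readings:
    as_int = struct.unpack("<I", state)[0]    # the bytes read as an integer
    as_float = struct.unpack("<f", state)[0]  # the same bytes read as a float
    as_text = state.decode("ascii")           # the same bytes read as text

    print(as_int, as_float, as_text)
    # Only the software that wrote those bytes "knows" which reading was meant;
    # the physical state by itself underdetermines what the program is doing.

This is only an analogy sketch, but it captures why inspecting the hardware state, without the interpretive context the running software supplies, tells you so little about what the system is "thinking."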
01-27-2002, 08:27 PM | #36 |
Senior Member
Join Date: Jun 2000
Location: Tucson, AZ USA
Posts: 966
Unfortunately, we didn't design the hardware, so how it works is still a mystery to us. Thank you for a much better analogy.
01-27-2002, 08:41 PM | #37 |
Regular Member
Join Date: Mar 2001
Location: Vancouver, British Columbia, Canada
Posts: 181
Quote:
However, I have trouble imagining that neural nets can 'recognize' anything at all, just as I would have trouble imagining that a vehicle is aware of the parts that it 'utilizes'... but that's a whole other debate.

Quote:
Maybe I am under the illusion that I possess free will... but again, that's a whole other debate too.

Without getting sidetracked, though, by the details of what may possess free will or exactly what perceives things, I think it is quite apparent that we are lacking physical evidence of the existence of mental phenomena... but does this mean that mental phenomena don't exist? Is this 'conversation' not real? How can we be aware of something that doesn't exist??

Quote:

Quote:
I would conclude that this is still true today. [Boo Hoo]

It's very intriguing, however, at least to me, that even though it is so difficult and maybe even impossible to give a verbal definition of awareness, it is probably one of the things which we all know most intimately! This really goes to show just how limited our language is.

Quote:
01-27-2002, 09:10 PM | #38 |
Regular Member
Join Date: Mar 2001
Location: Vancouver, British Columbia, Canada
Posts: 181
Quote:
As far as anyone who is looking in your brain can tell, you don't have a mind at all. This seems to suggest, at least to me, a major flaw in materialism: the view that everything that actually exists is material, or physical. [ Note: Many philosophers and scientists now use the terms 'material' and 'physical' interchangeably. ]

Quote:
I personally don't believe that mental phenomena (i.e. thoughts, feelings, wills, desires, etc.) are conscious (or aware) of anything.

Quote:
Materialism just doesn't seem to make any sense.

[ January 27, 2002: Message edited by: Filip Sandor ]
01-27-2002, 09:39 PM | #39 |
Regular Member
Join Date: Mar 2001
Location: Vancouver, British Columbia, Canada
Posts: 181
Quote:
My analogy is actually quite simple -- it only appears complex when you apply it to mental phenomena. In essence, you can label just about anything as a 'quality' or 'property' of something, which tends to get quite confusing. To put it more simply, I am exploiting the fact that there is a major difference between the 'qualities' or 'properties' of any mental phenomenon and the 'qualities' or 'properties' of the physical phenomenon that it corresponds to in the brain, whatever that may be.

Essentially, we are faced with a consequential dilemma: if the mind is really a physical thing and it is in the brain, then either the neuroscientist who is looking at your brain is having a mass hallucination or you are having a mass hallucination. The only other alternative is that neither of you is hallucinating and that both of you are actually perceiving different phenomena.
01-27-2002, 09:57 PM | #40 |
Regular Member
Join Date: Mar 2001
Location: Vancouver, British Columbia, Canada
Posts: 181
Quote:
Quote:
P.S. No, I am not suggesting that we should see mental images on an MRI scan -- I am not a materialist.