FRDB Archives

Freethought & Rationalism Archive

The archives are read only.


Old 05-23-2003, 05:58 PM   #31
Veteran Member
 
Join Date: Sep 2000
Location: Massachusetts, USA -- Let's Go Red Sox!
Posts: 1,500
Default

ComestibleVenom,

I sense some confusion. You maintain that:

"The other extreme is the qualia that Daniel Dennett denies. No, these theories cannot in fact be reduced insofar as they are factually incorrect. If it cannot be reduced, appeals to non-physical entities or extraneous ontological whisps are inevitable."

I don't doubt that consciousness can be causally reduced. There are, as I have said, very good reasons to believe that consciousness cannot be *ontologically* reduced. These are two very different things, neither of which necessarily appeals to any sort of magical ooze. It's simply the recognition that, no matter how hard you try, there are always going to be two types of physical things: those that exist in the third person, and those that exist in the first.

You continue:

"Their position is, however, much closer to scientific practice. Thus, despite the popular appeal of their folk-psychological intuitions, despite the appeal of the idea that humans have limited infallibility, such opinions can only become more irrelevant as the neurosciences develop."

I don't think this is true at all. As far as specific research programs go, do you really think the difference between the statements "X causes Y" and "X is Y" has much practical importance?

And then:

"There is *no such thing as a first-person cause*. Every word you say about this first person perspectives isolated epistemically from the third-person world, is itself caused by nothing other than that third person world. The reduction is inevitable unless an appeal to ghosts is made."

Two points. First, I never mentioned a "first-person cause". Rather, I mentioned third-person phenomena (the relevant workings of the brain) causing first-person phenomena. I suppose there is a way in which you can say mental events (first-person) cause non-mental physical events (third-person, like bodily motion), but that is simply a higher-order, "macro" way of looking at the issue, and not strictly true (as the first-person events are caused by the third-person in the first place).

Second, there is some confusion here between the ontological issue and the epistemic one. It does not follow that because consciousness exists in the first-person ontologically, it cannot be the subject of third-person scientific inquiry. These are two logically and conceptually distinct areas. Dennett repeats this error over and over in his work.

And finally:

"Secondly, you are totally misrepresenting Dennett. He denies only ideosyncratic and contradictory notions of what a first person mental state is. Namely those that are intrinsic, ineffable and infallible."

Dennett denies consciousness as traditionally understood and obviously true: that there are subjective, first-person events going on inside my head (pinch yourself if you don't believe me!).

For example, from Consciousness Explained:

"Why should a "zombie's" crushed hopes matter less than a conscious person's crushed hopes? There is a trick with mirrors here that should be exposed and discarded. Consciousness, you say, is what matters, but then you cling to doctrines about consciousness that systematically prevent us from getting any purchase on why it matters. Postulating special inner qualities that are not only private and intrinsically valuable, but also unconfirmable and uninvestigatable is just obscurantism" (450)

...the point here being, and what sets Dennett apart from other Strong AIers, is not that there could be no unconscious zombies because any machine, regardless of what it is made of, that behaved like us would be conscious. No sir! What Dennett is instead claiming is that *we* are zombies; that there is no difference between us and a zombie that lacks conscious states.

The claim is not that zombies could be conscious if endowed with the right program, but that consciousness, per se, does not exist; there are only complex states of zombiehood. For Dennett, "consciousness" is no longer consciousness (it doesn't exist, remember?), but something entirely different.

As for me, I'll have to conclude with Dr. Searle when he says:

"I regard Dennett's denial of the existence of consciousness not as a new discovery or even a serious possibility but rather as a form of intellectual pathology."

-GFA
God Fearing Atheist is offline  
Old 05-23-2003, 07:00 PM   #32
Veteran Member
 
Join Date: May 2001
Location: US
Posts: 5,495
Default

Quote:
Originally posted by God Fearing Atheist
Dennett denies consciousness as traditionally understood and obviously true: that there are subjective, first-person events going on inside my head (pinch yourself if you don't believe me!).

For example, from Consciousness Explained:
GFA: I agree. For me, while the book was informative and entertaining, the title is a misnomer - it doesn't explain consciousness.
John Page is offline  
Old 05-23-2003, 07:03 PM   #33
Veteran Member
 
Join Date: Apr 2003
Location: British Columbia
Posts: 1,027
Default Re: Hi

Quote:
Originally posted by Thomas Ash
It sure feels like it comes from experience, and that's pretty hard to argue against.
True, and I believe that belief in qualia does come from experience. Our perceptions, which are physical and causal, result in our belief in things like qualia. This is a false belief, but it is still caused by what it seems to be caused by.

Quote:

But that's just Identity Theory, and I've already listed the flaws of that (and others have covered them yet further.)
I think that in general these arguments claim that we know that there must be something more than the third-person physical because we experience it. But as I say, all arguments of this type are invalid for the stated reason. Of course, it's still interesting to talk about why people find them persuasive.

To Comestible Venom,

I don't think Dennett denies the existence of consciousness, and I don't think you have interpreted the quoted passage in the way he intended. The passage assumes the standpoint of someone who believes in non-physical consciousness, and then argues that this makes it hard to say anything useful about consciousness. It is non-physical consciousness that he derides as unconfirmable.
sodium is offline  
Old 05-23-2003, 07:21 PM   #34
Veteran Member
 
Join Date: Sep 2000
Location: Massachusetts, USA -- Let's Go Red Sox!
Posts: 1,500
Default Re: Re: Hi

Quote:
Originally posted by sodium
To Comestible Venom,

I don't think Dennett denies the existence of consciousness, and I don't think you have interpreted the quoted passage in the way he intended. The passage assumes the standpoint of someone who believes in non-physical consciousness, and then argues that this makes it hard to say anything useful about consciousness. It is non-physical consciousness that he derides as unconfirmable.
I can only assume this was intended for me:

What Dennett is obviously denying isn't "non-physical consciousness". Substance dualists are the only folks who accept that sort of hokey nonsense.

Dennett is denying consciousness as *subjective and first-person*, which is 1) obviously false and 2) unnecessary, since, as pointed out above, subjective consciousness is perfectly acceptable as the object of third-person *epistemic* study.

He's saying a zombie's "feelings" matter because we too are zombies.

-GFA
God Fearing Atheist is offline  
Old 05-23-2003, 09:55 PM   #35
Veteran Member
 
Join Date: May 2003
Location: Alberta, Canada
Posts: 2,320
Default Re: Re: Re: Hi

Quote:
Originally posted by God Fearing Atheist

Dennett is denying consciousness as *subjective and first-person*, which is 1) obviously false and 2) unnecessary, since, as pointed out above, subjective consciousness is perfectly acceptable as the object of third-person *epistemic* study.
I am very familiar with Dennett, and have carefully read a substantial portion of his published work. I know for a fact that his tactic of denying things like that is merely to break down stubborn presuppositions surrounding those terms.

Obviously he understands the heuristic truth of First Person language games, but he demonstrates that they constitute an undesirable and confusing foundation for philosophy of mind.
ComestibleVenom is offline  
Old 05-23-2003, 10:40 PM   #36
Veteran Member
 
Join Date: May 2003
Location: Alberta, Canada
Posts: 2,320
Default

God Fearing Atheist, thank you for your thoughtful replies.

Quote:
Originally posted by God Fearing Atheist
I don't doubt that consciousness can be causally reduced. There are, as I have said, very good reasons to believe that consciousness cannot be *ontologically* reduced. These are two very different things, neither of which necessarily appeals to any sort of magical ooze. It's simply the recognition that, no matter how hard you try, there are always going to be two types of physical things: those that exist in the third person, and those that exist in the first.
Do you really know what you mean by ontology? I would appreciate it if you would clarify, because I can't make any sense of the notion that a physical, publicly accessible brain would have some very special ontological feature. What does this brand-new ontological quality do?

Nothing. All the explanatory, functional, structural work is done within normal science. I realize you claim to invoke no soul, but this invisible feature sounds suspiciously like one.

Either consciousness is public, in which case science can study it without adding to our ontology, or it is private, in which case it is hidden from view. Committing to a private view of consciousness makes your position vulnerable to Dennett's philosophical criticisms (along with those of the Wittgensteinians).

Your contention that there are 'two types of physical things' strikes me as an inclination to embrace both a private and a public view of consciousness. To be a physical thing implies access, public access. You seem to be invoking a private physical existence, this 'first person' experience. The problem is that human knowledge about 'first person' phenomena fundamentally relies upon linguistically aided cognition, upon background reasoning acquired through environmental input.

Since this 'first person' knowledge relies upon - both causally and epistemically - access to the outside world, it is not a private language. Its nature, structure and, indeed, its confusions can be studied scientifically in the same way that any other merely public physical object can be.


Quote:
I don't think this is true at all. As far as specific research programs go, do you really think the difference between the statements "X causes Y" and "X is Y" has much practical importance?
Good question! The practical implications of philosophy (and the philosophical implications of practical reality) are subtle, and probably beyond our full grasp.

I think in that particular example, you're right. I honestly don't believe those make as much difference as they're cracked up to.

However, the subtle and frankly eliminable differences in terminology disguise some genuinely substantial questions about the nature of knowledge. (The issue of a private language is very much bound up with the epiphenomenalist inclination to assert that 'the brain causes consciousness'.)

How we think about thinking about knowledge IS directly relevant to scientific practice, and a lucid position on it can eliminate many false leads and simplify baffling quandaries.

Quote:
Two points. First, I never mentioned a "first-person cause". Rather, I mentioned third-person phenomena (the relevant workings of the brain) causing first-person phenomena. I suppose there is a way in which you can say mental events (first-person) cause non-mental physical events (third-person, like bodily motion), but that is simply a higher-order, "macro" way of looking at the issue, and not strictly true (as the first-person events are caused by the third-person in the first place).
This seems to be straightforward epiphenomenalism. If your first-person occurrences actually exist, then they necessarily CAUSED you in some way to talk about them. But on your account they don't: only ('third person') physical systems cause you to speak about 'first person' events.

Quote:
Second, there is some confusion here between the ontological issue and the epistemic one. It does not follow that because consciousness exists in the first-person ontologically, it cannot be the subject of third-person scientific inquiry. These are two logically and conceptually distinct areas. Dennett repeats this error over and over in his work.
I don't follow what you even mean by ontology. What reason is there for this addition to the ontology if it cannot be seen, measured or rationally inferred from any scientific evidence?

You are crucially misunderstanding Dennett, because it's very clear that his use of 'qualia' and, in some contexts, 'first person' relates directly back to the whole idea of a private language. If consciousness is not private, it is a publicly accessible phenomenon like any other; thus no ontological magic, no soul or anything of that sort is necessary. If it IS private, it is fundamentally hidden from scientific observation. You cannot have it both ways.

The problem, which Dennett very effectively illuminates, is that 'conscious phenomena' are NOT, in fact, hidden from view.

They are complex and idiosyncratic, but certainly nothing on the order of ontological anomalies.

Quote:
Dennett denies consciousness as traditionally understood and obviously true: that there are subjective, first-person events going on inside my head (pinch yourself if you don't believe me!).
*pinch* Ohh, that was a sharp pain that faded quickly when the pressure was removed.

I'm still as conscious as ever, and I still deny that there is any such thing as a first-person event independent in any meaningful ontological way from the rest of the world.

Quote:
...the point here being, and what sets Dennett apart from other Strong AIers, is not that there could be no unconscious zombies because any machine, regardless of what it is made of, that behaved like us would be conscious. No sir! What Dennett is instead claiming is that *we* are zombies; that there is no difference between us and a zombie that lacks conscious states.
Think about it. From what I have revealed to you about Dennett's position, it's just as rational to say that we are zombies as to say that zombies don't exist. We are zombies in that we lack private qualia; zombies don't exist in that consciousness is not the sort of thing that is in any way independent of the physical world.


Quote:
"I regard Dennett's denial of the existence of consciousness not as a new discovery or even a serious possibility but rather as a form of intellectual pathology."
This misunderstanding is really too much for me. Searle is unable to even imagine a different idea of consciousness than his own presuppositions lead him to.

I would concur with Dennett's diagnosis. This misunderstanding is nothing less than a failure of the human imagination.
ComestibleVenom is offline  
Old 05-23-2003, 11:48 PM   #37
Veteran Member
 
Join Date: Sep 2000
Location: Massachusetts, USA -- Let's Go Red Sox!
Posts: 1,500
Default

ComestibleVenom,

Let me try to clear up what I mean a bit:

Ontological subjectivity:

This refers to the mode of existence of conscious states. All conscious states are *someone's* conscious states. Just as I have a special relationship to my conscious states (they are *mine*) which is not like my relation to other people's conscious states, they have a relation to their conscious states which is not at all like my relation to theirs. The world has no points of view (it is ontologically third-person), but my view of the world does (consciousness is first-person). There are some good reasons to believe that consciousness is irreducibly like this. I will lay out a couple of them:

The logical possibility of zombies:

As I mentioned before, it is logically possible that there exists a zombie who is biologically or otherwise functionally identical to me -- who has all the same behavioral patterns, speaks the same words, in the same tones, etc. -- but lacks any sort of inner mental states. This is not to say it's possible in a natural sense; it's probably causally impossible that someone with an identical brain would not also be conscious. What it does show is that conscious states cannot be *ontologically reduced* to the third person. If they could, zombies would be a logical impossibility, and I don't see how that's the case.

The inverted spectrum:

It's logically possible that a being could be biologically identical to me, but have inverted conscious experiences. For example, my inverted twin could be seeing the color blue where I see the color red. That he *calls* his experiences "red" is irrelevant. What matters is that he experiences the things we both call "red" -- the top of my pack of Marlboros, for example -- the same way I experience things we call "blue". Again, I think the natural plausibility of this is absurd, but it shows that consciousness does not *logically* supervene on the third-person physical.

None of this seems to me a "failure of the human imagination." It is simply a reporting of the facts.

Epistemic objectivity:

Although consciousness may exist in the first person, nothing about its status as an object of scientific study necessarily follows. Consider the statement, "I feel happy right now." Happiness is a first-person conscious state; it is ontologically subjective. But it is also a statement of empirical fact that exists independent of your, or anyone else's, feelings about it. There is no reason, it seems to me, that this fact cannot be studied like any other fact, regardless of its ontology. If it can't be, why?

I hope this made sense. I'm really tired and pretty drunk.

-GFA
God Fearing Atheist is offline  
Old 05-24-2003, 09:57 AM   #38
Veteran Member
 
Join Date: May 2003
Location: Alberta, Canada
Posts: 2,320
Default

Quote:
Originally posted by God Fearing Atheist
Ontological subjectivity:

This refers to the mode of existence of conscious states. All conscious states are *someone's* conscious states. Just as I have a special relationship to my conscious states (they are *mine*) which is not like my relation to other people's conscious states, they have a relation to their conscious states which is not at all like my relation to theirs. The world has no points of view (it is ontologically third-person), but my view of the world does (consciousness is first-person).
Although I have some quibbles with your semantics, I have no problem believing that you have a special relationship to your conscious states. The reasons are purely physical. You are your brain, intricately intertwined with yourself in a way that no one else could be connected to you.

Quote:
As I mentioned before, it is logically possible that there exists a zombie who is biologically or otherwise functionally identical to me -- who has all the same behavioral patterns, speaks the same words, in the same tones, etc. -- but lacks any sort of inner mental states. This is not to say it's possible in a natural sense; it's probably causally impossible that someone with an identical brain would not also be conscious.
IF we are dealing with unconscious creatures that are logically possible and physically identical to humans THEN consciousness is not physical.

But the problem is that all the evidence suggests that consciousness is a biological phenomenon. Nonphysical consciousness is something we could all do without and never miss. The only sort of consciousness that makes any difference (to us or anyone else) is the sort that would mean zombies are logical contradictions, QED.

Quote:
It's logically possible that a being could be biologically identical to me, but have inverted conscious experiences... Again, I think the natural plausibility of this is absurd, but it shows that consciousness does not *logically* supervene on the third-person physical.
It shows no such thing. It simply involves an absurd (and, as science has indicated, probably referentially incoherent) hypothetical possibility that would be in principle impossible to detect.

Smells an awful lot like a soul.

Quote:
None of this seems to me a "failure of the human imagination." It is simply a reporting of the facts.
Both thought experiments either presuppose their own conclusions or are incoherent. Neither is a useful or in any way cogent argument.

Quote:
Happiness is a first-person conscious state; it is ontologically subjective... There is no reason, it seems to me, that this fact cannot be studied like any other fact, regardless of its ontology. If it can't be, why?
The problem is that you are mixing up the ideas of private and public knowledge. If consciousness is public, there is no special epistemological problem here: no zombies or inverted qualia are possible.

Since you are invoking zombies and such, it's clear that you mean to suggest that consciousness is private. If it is private in the relevant manner suggested (and required!) by your arguments, then it is not amenable to any sort of scientific study.
ComestibleVenom is offline  
Old 05-24-2003, 06:17 PM   #39
Veteran Member
 
Join Date: May 2001
Location: US
Posts: 5,495
Default

Quote:
Originally posted by ComestibleVenom
Since you are invoking zombies and such, it's clear that you mean to suggest that consciousness is private. If it is private in the relevant manner suggested (and required!) by your arguments, then it is not amenable to any sort of scientific study.
How about dying as a substitute for consciousness in the above?
John Page is offline  
Old 05-25-2003, 03:49 AM   #40
Senior Member
 
Join Date: Jan 2002
Location: Farnham, UK
Posts: 859
Default

I'd like to offer some potential answers or, more precisely, a conceptual model that can overcome some of the problems people have had here with Identity Theory, or indeed materialism in general.

My comments arise almost directly out of a book by Edgar Wilson, 'The Mental as Physical', which is essentially a defence of the Biperspectival Identity Theory. I have found it quite compelling with regard to solutions to the common problems IT theories face, so I unashamedly attempt to expound its virtues here. Indeed, I'm quite taken with this theory and as such apologise in advance for any kind of 'fanboy' feel to this post. I do hope that it offers a perspective that moves people further in their thinking on this issue, which I'm deeply interested in.

In short, the BIT is the view that the mind and brain are identical, insofar as they are the same thing understood from two different perspectives, the first-person and the third-person: the mind is simply the brain's awareness of its own processes and those of its extended sensory network and the environment, while the brain perspective is the one we get when we are not the particular brain as it works. This is explored further down in the paragraphs on 'channels of communication'.

Perhaps the easiest way to offer its insights and arguments is to address some of the comments made so far and expand from there.

Preamble

Firstly, Wilson distinguishes between the ontologically dualistic account of Cartesian dualism and its derivatives (the OA, or orthodox animistic, model) and the Physicalist Objectivist account (or PO model) - and no, it's got nothing to do with Rand!

Also, and this is key to the way he approaches the problem, Wilson posits that:

Quote:
understanding experience will minimally in some way involve classification within a conceptual system i.e. the structure of categories, types and relations established by 'intuitive induction' out of the individual's experience
Classification of concepts within a conceptual system establishes patterns of categorial relations or "addresses". Understanding involves interpretation in terms of existing concepts. New experience will either modify them or extend them. 'Understanding' is thus synonymous with the categorising process:

Quote:
an individual's conceptual schemata will determine the interpretation he consciously places on a given experience...An adequate conceptual system may be defined as one that has the scope and precision to render possible a complete account of the individual's experience to his own satisfaction.
Thus the cognitive strength of a system derives from its comprehensiveness and the mutual corroboration of its elements, suggesting to me that Wilson allies himself more with a coherentist account of knowledge.

What's important about this is that Wilson is not setting out to show that IT is a way of proving a dualist account wrong, or that Biperspectival IT is more right than other forms of non-dualistic explanation; rather, that there is common experience that both OA models and PO models attempt to, well, model in terms of a coherent conceptual system. Wilson's aim is to take the problems and see if the PO model can be as 'perspicuous' as the OA model; if so, he expects only that it will replace the OA model by making it redundant, simply in virtue of its not being as useful for understanding experience when further experiences are assimilated (as we might argue has been the case in the last 100 years of neuropsychological, cognitive-scientific and neurophysiological research). Thus, he says that IT recommends itself for the following reasons:

Quote:
1 - It resolves problems inherent in the orthodoxy, namely:
(i) why mind and brain should be related at all;
(ii) why a particular mind (personality) should be related and confined to a particular body (brain);
(iii) why (putatively) categorially distinct substances (or 'domains') should interact;
(iv) how (putatively) categorically distinct substances (or 'domains') do interact.

2 - It fulfils the general requirements for an adequate resolution of problems inherent in the orthodoxy
(i) the condition of metaphysical conservatism is satisfied because an identity theory is monistic;
(ii) it is true to experience because nothing is introduced and the phenomena of common experience are not reduced
my emphasis

Further on from this, with regard to my ongoing comments in this post, it also addresses some of the shortfalls of existing materialist accounts of the mind-body problem; i.e. it isn't saying 'you are simply wrong', it is saying 'here's a model that accounts for the problem you seem to be having in getting your conceptual system to cohere better'. Whether that's the case is of course another matter.

Thomas Ash:
Quote:
This is the problem that we can have thoughts about consciousness itself (as you are doing at this very moment.) The problem this poses for epiphenomenalism is: how did these thoughts get into your head in the first place? If the experience of consciousness is simply a kind of shadow to the light cast by the firing of neurons in the brain, then the knowledge that we are having this experience shouldn't be able to enter the causal chain of firing neurons in this place, any more than the light knows about or can be affected by the shadow it's casting.
Wilson attempts to model a solution here by positing that there are different channels of communication:

Quote:
The discriminations I make (in my interaction with the world) are not essentially different from the discriminations I observe others making. In my own case...I am a privileged observer: What is different is that I have direct or non-inferential access to my internal processes. Indeed, in an important sense I am my internal processes. (emphasis mine) I do not normally have access to the internal processes of others because there is normally no channel of communication open to me.
The channels might be said to end at the epidermis as regards the direct-access channels. As you say, Thomas, one can view another's introspective process, but of course this can only be done 'extradermally': one cannot be those processes (the 'mind' side of the BIT view of the brain); one can only observe the goings-on externally, i.e. see them in the third person, which is the 'brain' side of the understanding of the phenomenon we happen to call a brain.

As to your problem, Wilson would therefore say that the brain, being a network of systems, can 'self-scan' its own states, which in common parlance he identifies with 'introspection' or 'self-consciousness'. The brain is not something that can only have experience from outside itself, i.e. scan the environment; it can also scan itself. The inner and outer channels of access to its processes are integrated into the same brain.

Thomas Ash:
Quote:
There is a world of difference between the physical firing of a neuron and another neuron responding to this electrical signal, and a conscious mental event. However, perhaps mental events, which are after all little understood, simply do equate to physical events and are an integral part of them. This would provide a convenient explanation of how we are able to think about our mental events - it simply equates to thinking about our neural events. One problem with Identity Theory is that it would imply that every physical event, from the 'decisions' of a vending machine to the action of a doorhinge, has a mental correlate. The other problem is that it is very hard to provide evidence for: we don't have access to the mental events that physical actions, other than those in our brain, do or do not cause.
As understood on the BIT, the difference is a phenomenological one: it is simply the difference between being the system and observing the system. When you are the system, the having of neurons firing is the having of conscious experience. 'Conscious experience' is a term we use to describe what it is to be a brain full of sets of neural systems correlating together in a self-scanning, recursively interacting neural 'supersystem'.

The problem of access is only resolved by referring to people's first-person accounts of their experience as we, perhaps, prod their brains. However, the point with BIT is that we don't attempt to reduce the mental events; rather, we see them as first-person reportage of physical events that are phenomenologically different simply in virtue of the mode of access.

With regard to mental predicates for all physical things, I do not see how that follows. If we are talking about mental predicates - if, on the traditional OA view of the mind, we predicate of people that they have 'goal directedness', 'will', 'intentionality', 'memory' etc. - we are talking about things that, in the PO model, can be understood in more neutral terms. Key to this is the concept of relations. The relations of the neurons are as important as the neurons themselves; after all, an aggregate of neurons is not a system of neurons, and we only come to understand something as a system if we posit of it that it has a coherent relational structure. The organisation of subsystems affects their character: my brain is probably organised slightly differently to yours with regard to our subsystems, due to our different experiences etc., so we have a different consciousness.

To take your two examples, on the above account we can say that they are systems, and that their organisation and relations are different to those of the raw materials that constitute them as aggregates in a pile on the floor, for example. However, it is a mistake to posit mental predicates of them, because there is no meaningful sense in which we can say there is a mental life, no 'first person perspective', of a doorhinge: the idea is inappropriate to the system, and crucially is not corroborated by the system. How we understand and go about our lives using the terms we associate with the concept of consciousness relates to certain sets of systems in the universe, namely ours. The idea only works because we have a problem wondering how there is a self-perspective in more complex neural systems like ours. This is a feature of the relational organisation of brains, not of doorhinges, so to try to map the concept of 'mentallings' onto these objects is inappropriate.

If experience suggested to us, when we try to provide a coherent model by which to understand it, that doorhinges could be sad, or vending machines bored, I'm sure we'd have a problem with regard to our concept of the importance of physical interrelations to conscious life, and might need to revisit the OA model. However, if consciousness is evident as a result of certain sorts of interrelations of certain sorts of matter, evident due to the ability of those organisations of matter (humans) to express as much, then I don't see how we must of necessity go positing consciousness or mental predicates of entirely different and far cruder mechanistic objects. I would be interested to see how someone can take a vending machine and offer a first-person perspective of its 'experience'; indeed, why the term should even apply and why it could not be construed as a misapplication.

BIT isn't saying that because all mental events are merely physical, all physical events have a mental correlate; rather, 'mental events' is a name for the phenomenologically distinct experience that certain sorts of organism - namely, human beings - have. Because human beings are not doorhinges, we must have some ground for wondering why the concepts we use to describe what it is to be a brain should apply equally to some question of 'what it is to be a doorhinge'.

ComestibleVenom outlined this response more efficaciously than I.

God Fearing Atheist:
Quote:
That the mind is *ontologically* distinct from what goes on in the brain is, I think, clear enough.

Otherwise, you're committed to absurd conclusions like the logical impossibility of zombies, that brain and mind are rigid designators (a la Kripke's argument against identity theory), that a complete knowledge of neurophysiology would allow us to understand conscious phenomena without first having experienced them (a la Jackson), and so forth.
I don't think it is; I think it's phenomenologically distinct, not ontologically distinct. Brain and mind are rigid designators; on Wilson's view, he compares this idea with Frege's distinction between sense and referent. The referent of the contingently different descriptions 'morning star' and 'evening star' is the same, and on Wilson's view we contingently 'know' there is a difference because of the difference between being a first person and experiencing others' first-person states in the third person. Yet two descriptions of a brain state, from direct access and from external access (perhaps using some sort of prosthesis like MRI), are two rigid designators for the same phenomenon.

The counter is that the morning and evening stars are both physical, and the question at issue is precisely how a mental and a physical phenomenon can rigidly designate the same thing. However, as outlined in the preamble, Wilson is working from the basis that there is only one ontological realm, the physical, and seeing whether this can account for the problem. By setting up the idea that the two could be the same - call it a working hypothesis - the problem is to see where this hypothesis would break down. If we don't assume the mental is a separate, ontologically distinct set of goings-on, and we introduce the concepts of 'access' and biperspectivism, we do appear to have the beginnings of a coherent model that fits well with other scientific research and does not postulate extraneous entities, i.e. it is at least as parsimonious. Thus, the dualist's criticism would be misplaced, because Wilson isn't working from their OA basis.

Also, biperspectivism is inclusive of 'mental' predicates: it acknowledges them as meaningful, and they thus participate in the overall model of the mind-brain relation. Wilson would argue that one could not have complete knowledge of consciousness from a third-person access perspective only, because that would arbitrarily discount the first-person account of experience.

God Fearing Atheist:

Quote:
point out that it is very unlikely that there is (assuming the coherence of IT in the first place) only one possible type of neurophysiological state with which mental states are identical. Consider, for example, non-human animals, or space aliens with a completely different sort of biology.
On the BIT account, this criticism is misplaced. All systems have interrelations, and these determine and affect the character of the systems themselves. Insofar as mental predicates are first-person access reports of brain states (on this PO model), and insofar as there are similar non-human systems with interrelations such that concepts like 'self-consciousness', 'pattern creation' and 'goal directedness' seem to apply (proved or postulated through experience of these non-human systems), I don't see why the model can't support the notion that there exist streams of states in other central nervous systems that, from the perspective of that nervous system, could be classified as mental - if that system chose to apply our language game relating to what 'mental' means to us, that is! Sorry, I'm getting tired.

Quote:
I think consciousness is a global property of the brain; a causal result of the workings of neurons, synapses and so forth, but one that cannot be ontologically reduced to those processes.
I disagree; I think it is nothing more than the processes and their interrelations. There is no ontological distinctness, only phenomenological distinctness, on Wilson's account.

ComestibleVenom:
Quote:
Since the brain is chemicals, uncontroversially so, it can be reduced. Everything the brain DOES can be reduced to physics.
Entirely true. For Wilson, it's important to acknowledge that reducing things down is not useful in all cases. He adopts the systems-philosophy view of Ervin Laszlo:

Quote:
it postulates a hierarchical continuum of systems...from subatomic particles, atoms...multicellular organisms and ultimately societies."
As such, where in the hierarchy you want your explanation of what's going on to come from is determined by which part of the hierarchy you're interested in. Here we're probably interested at the multicellular level rather than the subatomic, but this is not to say that in some sense the one system isn't more 'basically' described in terms of the other. Reduction is of course possible, given physicalism; it just isn't efficacious for all purposes. What I admire about this view is that it doesn't 'value' a reduction, it merely acknowledges it as possible; but within the overall conceptual system we have, reduction of mental predicates to physical ones does not serve any good purpose, and we lose part of the 'whole picture' of experience, which of course is made up of different perspectives.

Anyway, that's got all that off my chest

Adrian
Adrian Selby is offline  
 
