Freethought & Rationalism Archive. The archives are read only.
05-23-2003, 05:58 PM | #31
Veteran Member
Join Date: Sep 2000
Location: Massachusetts, USA -- Let's Go Red Sox!
Posts: 1,500
ComestibleVenom,
I sense some confusion. You maintain that: "The other extreme is the qualia that Daniel Dennett denies. No, these theories cannot in fact be reduced insofar as they are factually incorrect. If it cannot be reduced, appeals to non-physical entities or extraneous ontological whisps are inevitable."

I don't doubt that consciousness can be causally reduced. There are, as I have said, very good reasons to believe that consciousness cannot be *ontologically* reduced. These are two very different things, neither of which necessarily appeals to any sort of magical ooze. It's simply the recognition that, no matter how hard you try, there are always going to be two types of physical things: those that exist in the third person, and those that exist in the first.

You continue: "Their position is, however, much closer to scientific practice. Thus, despite the popular appeal of their folk-psychological intuitions, despite the appeal of the idea that humans have limited infallibility, such opinions can only become more irrelevant as the neurosciences develop."

I don't think this is true at all. As far as specific research programs go, do you really think the difference between the statements "X causes Y" and "X is Y" has much practical importance?

And then: "There is *no such thing as a first-person cause*. Every word you say about this first person perspectives isolated epistemically from the third-person world, is itself caused by nothing other than that third person world. The reduction is inevitable unless an appeal to ghosts is made."

Two points. First, I never mentioned a "first-person cause". Rather, I mentioned third-person phenomena (the relevant workings of the brain) causing first-person phenomena. I suppose there is a way in which you can say mental events (first-person) cause non-mental physical events (third-person, like bodily motion), but that is simply a higher-order, "macro" way of looking at the issue, and not strictly true (as the first-person events are caused by the third-person in the first place). Second, there is some confusion here between the ontological issue and the epistemic one. It does not follow that, because consciousness exists in the first person ontologically, it cannot be the subject of third-person scientific inquiry. These are two logically and conceptually distinct areas. Dennett repeats this error over and over in his work.

And finally: "Secondly, you are totally misrepresenting Dennett. He denies only ideosyncratic and contradictory notions of what a first person mental state is. Namely those that are intrinsic, ineffable and infallible."

Dennett denies consciousness as traditionally understood and obviously true: that there are subjective, first-person events going on inside my head (pinch yourself if you don't believe me!). For example, from Consciousness Explained: "Why should a "zombie's" crushed hopes matter less than a conscious person's crushed hopes? There is a trick with mirrors here that should be exposed and discarded. Consciousness, you say, is what matters, but then you cling to doctrines about consciousness that systematically prevent us from getting any purchase on why it matters. Postulating special inner qualities that are not only private and intrinsically valuable, but also unconfirmable and uninvestigatable is just obscurantism" (450).

The point here, and what sets Dennett apart from other Strong AIers, is not that there could be no unconscious zombies because any machine, regardless of what it is made of, that behaved like us would be conscious. No sir! What Dennett is instead claiming is that *we* are zombies; that there is no difference between us and a zombie that lacks conscious states. The claim is not that zombies could be conscious if endowed with the right program, but that consciousness, per se, does not exist; there are only complex states of zombiehood. For Dennett, "consciousness" is no longer consciousness (it doesn't exist, remember?), but something entirely different.

As for me, I'll have to conclude with Dr. Searle when he says: "I regard Dennett's denial of the existence of consciousness not as a new discovery or even a serious possibility but rather as a form of intellectual pathology."

-GFA
05-23-2003, 07:00 PM | #32
Veteran Member
Join Date: May 2001
Location: US
Posts: 5,495
Quote:
05-23-2003, 07:03 PM | #33
Veteran Member
Join Date: Apr 2003
Location: British Columbia
Posts: 1,027
Re: Hi
Quote:
Quote:
To ComestibleVenom: I don't think Dennett denies the existence of consciousness, and I don't think you have interpreted the quoted passage in the way he intended. The passage assumes the standpoint of someone who believes in non-physical consciousness, and then argues that this makes it hard to say anything useful about consciousness. It is non-physical consciousness that he derides as unconfirmable.
05-23-2003, 07:21 PM | #34
Veteran Member
Join Date: Sep 2000
Location: Massachusetts, USA -- Let's Go Red Sox!
Posts: 1,500
Re: Re: Hi
Quote:
What Dennett is obviously denying isn't "non-physical consciousness". Substance dualists are the only folks who accept that sort of hokey nonsense. Dennett is denying consciousness as *subjective and first person*, a denial that is 1) obviously false and 2) unnecessary, since, as pointed out above, subjective consciousness is perfectly acceptable as the object of third-person *epistemic* study. He's saying a zombie's "feelings" matter because we too are zombies.

-GFA
05-23-2003, 09:55 PM | #35
Veteran Member
Join Date: May 2003
Location: Alberta, Canada
Posts: 2,320
Re: Re: Re: Hi
Quote:
Obviously he understands the heuristic truth of First Person language games, but he demonstrates that they constitute an undesirable and confusing foundation for philosophy of mind.
05-23-2003, 10:40 PM | #36
Veteran Member
Join Date: May 2003
Location: Alberta, Canada
Posts: 2,320
God Fearing Atheist, thank you for your thoughtful replies.
Quote:
Nothing. All the explanatory, functional, structural work is done within normal science. I realize you claim to invoke no soul, but this invisible feature sounds suspiciously like one. Either consciousness is public, in which case science can study it without adding to our ontology, or it is private, in which case it is hidden from view. Committing to a private view of consciousness makes your position vulnerable to Dennett's philosophical criticisms (along with those of the Wittgensteinians).

Your contention that there are 'two types of physical things' strikes me as an inclination to embrace both a private and a public view of consciousness. To be a physical thing implies access, public access. You seem to be invoking a private physical existence, this 'first person' experience. The problem is that human knowledge about 'first person' phenomena fundamentally relies upon linguistically aided cognition, upon background reasoning acquired with environmental input. Since this 'first person' knowledge relies upon access to the outside world, both causally and epistemically, it is not a private language. Its nature, structure and, indeed, its confusions can be studied scientifically in the same way that any other merely public physical object can be.

Quote:
I think in that particular example you're right. I honestly don't believe those make as much difference as they're cracked up to. However, the subtle and frankly eliminable differences in terminology disguise some definitely substantial questions about the nature of knowledge. (The issue of a private language is very much bound up with the epiphenomenal inclination to assert 'the brain causes consciousness'.) How we think about thinking about knowledge IS directly relevant to scientific practice, and a lucid position on it can eliminate many false leads and simplify baffling quandaries.

Quote:
Quote:
You are crucially misunderstanding Dennett, because it's very clear that his use of 'qualia', and in some contexts 'first person', relates directly back to the whole idea of a private language. If consciousness is not private, it is a publicly accessible phenomenon like any other, and thus no ontological magic, no soul or anything of that sort is necessary. If it IS private, it is fundamentally hidden from scientific observation. You cannot have it both ways. The problem, which Dennett very effectively illuminates, is that 'conscious phenomena' are NOT, in fact, hidden from view. They are complex and idiosyncratic, but certainly nothing on the order of ontological anomalies.

Quote:
I'm still as conscious as ever, and I still deny that there is any such thing as a first-person event independent, in any meaningful ontological way, of the rest of the world.

Quote:
Quote:
I would concur with Dennett's diagnosis. This misunderstanding is nothing less than a failure of the human imagination.
05-23-2003, 11:48 PM | #37
Veteran Member
Join Date: Sep 2000
Location: Massachusetts, USA -- Let's Go Red Sox!
Posts: 1,500
ComestibleVenom,
Let me try to clear up what I mean a bit.

Ontological subjectivity: This refers to the mode of existence of conscious states. All conscious states are *someone's* conscious states. Just as I have a special relationship to my conscious states (they are *mine*) which is not like my relation to other people's conscious states, they have a relation to their conscious states which is not at all like my relation to theirs. The world has no points of view (it is ontologically third-person), but my view of the world does (consciousness is first-person). There are some good reasons to believe that consciousness is irreducibly like this. I will lay out a couple of the reasons:

The logical possibility of zombies: As I mentioned before, it is logically possible that there exists a zombie who is biologically or otherwise functionally identical to me -- who has all the same behavioral patterns, speaks the same words, in the same tones, etc. -- but lacks any sort of inner mental states. This is not to say it's possible in a natural sense; it's probably causally impossible that someone with an identical brain would not also be conscious. What it does show is that conscious states cannot be *ontologically reduced* to the third person. If they could, zombies would be a logical impossibility, and I don't see how that's the case.

The inverted spectrum: It's logically possible that a being could be biologically identical to me but have inverted conscious experiences. For example, my inverted twin could be seeing the color blue where I see the color red. That he *calls* his experiences "red" is irrelevant. What matters is that he experiences the same things we both call "red" -- the top of my pack of Marlboros, for example -- the same way I experience things we call "blue". Again, I think this is wildly implausible as a natural matter, but it shows that consciousness does not *logically* supervene on the third-person physical. None of this seems to me a "failure of the human imagination." It is simply a reporting of the facts.

Epistemic objectivity: Although consciousness may exist in the first person, nothing follows about whether it can be an object of scientific study. Consider the statement "I feel happy right now." Happiness is a first-person conscious state; it is ontologically subjective. But the statement is also a statement of empirical fact, one that holds independently of your, or anyone else's, feelings about it. There is no reason, it seems to me, that this fact cannot be studied like any other fact, regardless of its ontology. If it can't be, why?

I hope this made sense. I'm really tired and pretty drunk.

-GFA
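The supervenience point behind the zombie and inverted-spectrum arguments can be put schematically. This is a reconstruction in standard modal notation, not GFA's own formulation; the labels $P$ and $C$ are introduced here for illustration. Let $P$ be the complete third-person physical description of a person and $C$ the claim that this person has conscious states.

(1) Ontological reduction would require logical supervenience: $\Box(P \to C)$.
(2) The zombie premise: $\Diamond(P \land \lnot C)$, which is equivalent to $\lnot\Box(P \to C)$.
(3) Hence $C$ does not logically supervene on $P$, even if $P$ causally necessitates $C$ in the actual world.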
05-24-2003, 09:57 AM | #38
Veteran Member
Join Date: May 2003
Location: Alberta, Canada
Posts: 2,320
|
Quote:
Quote:
But the problem is that all the evidence suggests that consciousness is a biological phenomenon. Nonphysical consciousness is something we could all do without and never miss. The only sort of consciousness that makes any difference (to us or anyone else) is the sort that would mean zombies are logical contradictions, QED.

Quote:
Smells an awful lot like a soul.

Quote:
Quote:
Since you are invoking zombies and such, it's clear that you mean to suggest that consciousness is private. If it is private in the relevant manner suggested (and required!) by your arguments, then it is not amenable to any sort of scientific study.
|||||
05-24-2003, 06:17 PM | #39
Veteran Member
Join Date: May 2001
Location: US
Posts: 5,495
Quote:
05-25-2003, 03:49 AM | #40
Senior Member
Join Date: Jan 2002
Location: Farnham, UK
Posts: 859
|
I'd like to offer some potential answers or, more precisely, a conceptual model that can overcome some of the problems that people have had here with Identity Theory, or indeed with materialism in general.
My comments arise almost directly out of a book by an Edgar Wilson, 'The Mental as Physical', which is essentially a defence of the Biperspectival Identity Theory. I have found it quite compelling with regard to solutions to the common problems IT theories face, so I unashamedly attempt to expound its virtues here. Indeed, I'm quite taken with this theory, and as such I apologise in advance for any kind of 'fanboy' feel to this post. I do hope that it offers a perspective that moves people further in their thinking on this issue that I'm deeply interested in.

In short, BIT is the view that the mind and brain are identical, insofar as they are the same thing understood from two different perspectives, the first-person and the third-person. That is, the mind is simply the brain's awareness of its own processes and those of its extended sensory network and the environment, while the brain perspective is the one we get when we are not the particular brain as it works; this is explored further down in the paragraphs on 'channels of communication'. Perhaps the easiest way to offer its insights and arguments is to address some of the comments made so far and expand from there.

Preamble

Firstly, Wilson distinguishes between the ontologically dualistic account of Cartesian dualism and its derivatives (the OA, or orthodox animistic, model) and the Physicalist Objectivist account (or PO model); and no, it's got nothing to do with Rand! Also, and this is key to the way he approaches the problem, Wilson posits that:

Quote:
Quote:
What's important about this is that Wilson is not setting out to show that IT is a way of proving a dualist account wrong, or that Biperspectival IT is more right than other forms of non-dualistic explanation; rather, that there is common experience that both OA models and PO models attempt to, well, model in terms of a coherent conceptual system. Wilson's aim is to take the problems and see if the PO model can be as 'perspicuous' as the OA model; if so, he expects only that it will replace the OA model by making it redundant, simply in virtue of the OA model not being as useful for understanding experience as further experiences are assimilated (as we might argue has been the case over the last 100 years of neuropsychological, cognitive-scientific and neurophysiological research). Thus, he says that IT recommends itself for the following reasons:

Quote:
Further on from this, and with regard to my ongoing comments in this post, it also addresses some of the shortfalls of existing materialist accounts of the mind-body problem; i.e. it isn't saying 'You are simply wrong', it is saying 'here's a model that accounts for the problem you seem to be having in getting your conceptual system to cohere better'. Whether that's the case is of course another matter.

Thomas Ash:
Quote:
Quote:
As to your problem, Wilson would therefore say that the brain, being a network of systems, can 'self-scan' its own states, which in common parlance he identifies with 'introspection' or 'self-consciousness'. The brain is not something that can only have experience from outside itself, i.e. scan the environment; it can also scan itself. The inner and outer channels of access to its processes are integrated into the same brain.

Thomas Ash:
Quote:
The problem of access is only resolved by referring to people's first-person accounts of their experience as we, perhaps, prod their brains. However, the point with BIT is that we don't attempt to reduce the mental events; rather, we see them as first-person reportage of physical events that are phenomenologically different simply in virtue of the mode of access.

With regard to mental predicates for all physical things, I do not see how that follows. If we are talking about mental predicates, or we predicate of people, on the traditional OA view of the mind, 'goal directedness', 'will', 'intentionality', 'memory', etc., we are talking about things that, in the PO model, can be understood in more neutral terms. Key to this is the concept of relations. It is the relations of the neurons that are as important as the neurons themselves; after all, an aggregate of neurons is not a system of neurons, and we only come to understand something as a system if we posit of it that it has a coherence and relational structure. The organisation of subsystems affects their character: my brain is probably organised slightly differently to yours, with regard to our subsystems, due to our different experiences etc., so we have a different consciousness.

To take your two examples, on the above account we can say that they are systems, and that their organisation and relations are different to the raw materials that are constitutive of them as aggregates, in a pile on the floor for example. However, it is a mistake to posit mental predicates of them, because there is no meaningful sense in which we can say there is a mental life. There is no 'first person perspective' of a door hinge, because the idea of this is inappropriate to the system and, crucially, is not corroborated by the system. How we understand and go about our lives using the terms we associate with the concept of consciousness relates to certain sets of systems in the universe, namely, ours. The idea only works because we have a problem wondering how there is self-perspective in more complex neural systems like ours. This is a feature of the relational organisation of brains, not of door hinges, so to try to map the concept of 'mentallings' to these objects is inappropriate.

If experience suggested to us, when we try to provide a coherent model by which to understand it, that door hinges could be sad, or vending machines bored, I'm sure we'd have a problem with regard to our concept of the importance of physical interrelations to conscious life, and might need to revisit the OA model. However, if consciousness is evident as a result of certain sorts of interrelations of certain sorts of matter, evident due to the ability of those organisations of matter (humans) to express as much, then I don't see how we must of necessity go positing consciousness or mental predicates of entirely different and far cruder mechanistic objects. I would be interested to see how someone could take a vending machine and offer a first-person perspective of its 'experience'; indeed, why the term should even apply and why it could not be construed as a misapplication.

BIT isn't saying that because all mental events are merely physical, all physical events have a mental correlate; rather, 'mental events' is a name for the phenomenologically distinct experience that certain sorts of organism have, namely, human beings.
Because human beings are not door hinges, we must have some ground for thinking that the concepts we use to describe the nature of what it is to be a brain should apply equally to some question of 'what it is to be a door hinge'. ComestibleVenom outlined this response more efficaciously than I.

God-fearing Atheist:
Quote:
The counter is that the morning and evening stars are both physical, and the question at issue is precisely how a mental term and a physical term can rigidly designate the same thing. However, as outlined in the preamble, Wilson is working from the basis that there is only one ontological realm, the physical, and seeing whether this can account for the problem. By setting up the idea that the two could be the same (call it a working hypothesis), the problem is to see where this hypothesis would break down. If we don't assume the mental is a separate, ontologically distinct set of goings-on, and we introduce the concepts of 'access' and biperspectivism, we do appear to have the beginnings of a coherent model that fits well with other scientific research and does not postulate extraneous entities; i.e. it is at least as parsimonious. Thus, the dualist's criticism would be misplaced, because Wilson isn't working from their OA basis. Also, biperspectivism is inclusive of 'mental' predicates: it does acknowledge them as meaningful, and they thus participate in the overall model of the mind-brain relation. Wilson would argue that one could not have complete knowledge of consciousness from a third-person access perspective only, because that seems to arbitrarily discount the first-person account of experience.

God-fearing Atheist:
Quote:
Quote:
ComestibleVenom:
Quote:
Quote:
Anyway, that's got all that off my chest.

Adrian