FRDB Archives

Freethought & Rationalism Archive

The archives are read only.


FRDB Archives > Archives > Religion (Closed) > Non Abrahamic Religions & Philosophies
Old 11-02-2004, 12:27 PM   #71
Regular Member
 
Join Date: Jul 2004
Location: Chicago
Posts: 381
Default

Quote:
Originally Posted by Valmont
Well, argument by analogy is always a dangerous gambit. However, I think that I am able to furnish a better analogy, although as an illustration not as an argument. Joseph Malik is a human being. He certainly exists, therefore, in the sphere of moral discourse and his actions are subject to our moral evaluation. Moreover, we should maintain that other moral agents have a moral obligation to him.

One day, aboard a yellow submarine called the Leif Erikson, Joe Malik comes to a profound realisation. He is a character in a book. The authors, Shea and Wilson, have heaped humiliation and tribulation upon him in the name of a good story.

So, my question is, do Shea and Wilson have a moral obligation to Joseph Malik?
Obviously, the problem here is that this is a fantasy. In the real world, characters in books cannot be moral agents, and authors do not stand in moral relationships to their characters (though they do to a person - if there is one - who provides the character basis). But it is precisely the premise of the fantasy that such a relationship does hold in this case (to wit that a character is real and fictitious simultaneously), so if you allow yourself some suspension of disbelief, the answer to the question must be, "yes, they do."

Now let me suggest how we can fix this analogy so that it does not suffer from the same defect. Suppose instead that Shea and Wilson are computer programmers from the future who have written a virtual reality world, in which there exists a virtual yellow submarine and a virtual, artificially intelligent, Joseph Malik. This Malik too is a character, so to speak, but he is a real being, not just words on paper. His humiliation and his tribulation are as real as those of the programmers. Do Shea and Wilson now have a moral obligation to him?

In fact, this analogy seems to get at the heart of the matter. You, Valmont, have a story about the origin and function of morality, which I think is (partly) mistaken. And I think that this analogy brings the mistake out.

It strikes me that morality is primarily a practice that allows agents with contrarian values and agendas to negotiate and co-ordinate them for mutual benefit. I think we agree about this much, though this is already simplifying greatly. However, morality does not work through threat of sanction and promise of benefit (though justice might). It works through persuasion and argument - it is a form of discourse. For a claim of moral responsibility to be made, it is sufficient that two agents be able to enter into moral discourse, that their values and agendas be mutually intelligible. Moral relationships are not power relationships. In the computer sim analogy, the programmers have all the power, there are no sanctions to be levied against them, and no benefit to be gained by acting morally. But they are still morally responsible, and we would still think them cruel if they mistreated an AI in their sim (more to the point, their creation would think them cruel, and rightly so).

Now notice I said that it is sufficient for moral responsibility that two agents be able to enter into moral discourse; it is not necessary. It is not necessary because we might reasonably be said to have a moral responsibility to things that cannot in any way, shape, or form engage us morally. We might have a moral responsibility to preserve nature, or not to destroy great works of art (I'm not saying we do. I'm saying someone could reasonably make that claim and I would be interested in discussing it.) More simply, we have a moral responsibility to all those paragons of amorality mentioned earlier - tigers, mentally handicapped people, and so on. Isn't it cruel, immoral, to torture dogs? But dogs do not share our society, and they do not share our moral conceptual scheme. You might say, I suppose, that the reason we do not torture dogs is for fear of sanction by dog-lovers. We recognize their agenda, acknowledge responsibility to them. But this is perverse - our responsibility is to the dog, not its owner.

You see where this is going. God cannot be exempt from moral evaluation simply because god is very powerful, or very alien. So long as god knows us, acknowledges our existence, he becomes subject to our evaluation. Of course, it might be that god doesn't give two shits about our evaluation. He might think that we are dumb, or immature. His values might simply be incommensurable with ours. But then we will say that he is evil, that he is our enemy - that is our only recourse when an alien will and agenda contrarian to ours is imposed on us, and where talk will not help.
All Hail Discordia! is offline  
Old 11-02-2004, 05:03 PM   #72
Banned
 
Join Date: Oct 2004
Location: Pacific Northwest
Posts: 10,066
Default

RE All Hail Discordia's post...

Yes, yes that's it exactly!
muidiri is offline  
Old 11-03-2004, 02:12 AM   #73
Regular Member
 
Join Date: Apr 2004
Location: UK
Posts: 374
Default

Quote:
Originally Posted by muidiri
In what ways does BOB differ from GOD?
I do not know, as I cannot say precisely what a three-legged kurquet from Rigel 7 is. But, as I said, if he has a moral interest then he is different from God in an important way.

Quote:
Originally Posted by muidiri
Does god have no interest in morality holding sway in our world? If so he has no interest, wouldn't this mean he really doesn't care about the welfare of any human on this planet? If god directs us to act morally (the ten commandments)... doesn't that show that he has an interest in morality holding sway? How then can he not have a moral obligation to humans?
No. God does care about the welfare of humankind. He desires us to benefit from the government of morality. But He does not, Himself, have an interest in morality. He is not Himself a beneficiary of the government of morality. Therefore He exists outside of the moral contract.

Quote:
Originally Posted by All Hail Discordia!
It strikes me that morality is primarily a practice that allows agents with contrarian values and agendas to negotiate and co-ordinate them for mutual benefit. I think we agree about this much, though this is already simplifying greatly. However, morality does not work through threat of sanction and promise of benefit (though justice might). It works through persuasion and argument - it is a form of discourse. For a claim of moral responsibility to be made, it is sufficient that two agents be able to enter into moral discourse, that their values and agendas be mutually intelligible. Moral relationships are not power relationships. In the computer sim analogy, the programmers have all the power, there are no sanctions to be levied against them, and no benefit to be gained by acting morally. But they are still morally responsible, and we would still think them cruel if they mistreated an AI in their sim (more to the point, their creation would think them cruel, and rightly so).
I do not think that you have demonstrated that moral interest provides an insufficient definition of moral agency. At best you have shown that those who are moral agents might have moral obligations to those who are not. And I do not think you have shown that moral discourse provides a better definition.

For a start, it is not at all clear to me that Shea and Wilson do have an obligation to the computer program, Malik. You have assumed that they do and have used that assumption as the basis for your argument. This is mere question-begging. I do not doubt that Joe Malik could consider Shea and Wilson cruel or even immoral. The question is whether it is reasonable for him to do so.

Secondly, my definition of morality is not about a power relation. If one moral agent, one individual with a moral interest, has complete power over another, they still have a moral obligation, because they have a moral interest.

Now, as to your model: it fails because it is yet another Godless Divine Command theory. It provides an answer to the question, "What is morality?" It asserts who has a moral obligation and to whom. It utterly fails to address the how and the why. Again, it robs morality of its power and reduces it to a mere label.

Quote:
Originally Posted by All Hail Discordia!
Now notice I said that it is sufficient for moral responsibility that two agents be able to enter into moral discourse; it is not necessary. It is not necessary because we might reasonably be said to have a moral responsibility to things that cannot in any way, shape, or form engage us morally. We might have a moral responsibility to preserve nature, or not to destroy great works of art (I'm not saying we do. I'm saying someone could reasonably make that claim and I would be interested in discussing it.) More simply, we have a moral responsibility to all those paragons of amorality mentioned earlier - tigers, mentally handicapped people, and so on. Isn't it cruel, immoral, to torture dogs? But dogs do not share our society, and they do not share our moral conceptual scheme. You might say, I suppose, that the reason we do not torture dogs is for fear of sanction by dog-lovers. We recognize their agenda, acknowledge responsibility to them. But this is perverse - our responsibility is to the dog, not its owner.
It strikes me that I could make a very similar case for my argument. All that is necessary for one to be under a moral obligation is that one has a moral interest. It is not necessary for the one to whom we have a moral obligation to be, themselves, a moral agent.

Bear in mind, we have not always been so willing to extend our moral obligation to dogs. Even now, we recognise animal cruelty as a lesser moral offence than the torture of humans. There is no doubt that we can extend our moral sentiment beyond the scope of its original purpose (the prosperity and security of society). But this is an arbitrary and cultural extension, not an absolute. In general, as human society and culture have progressed, our sense of moral obligation has become more and more general. We should not assume that because we now recognise a moral obligation to dogs it was always present. It is simply that our moral sentiment has now developed to that state.

And, of course, this observation of moral progression in no way invalidates the definition of a moral agent as one who possesses a moral interest.
Valmont is offline  
Old 11-03-2004, 04:47 AM   #74
Banned
 
Join Date: Oct 2004
Location: Pacific Northwest
Posts: 10,066
Default

Quote:
Originally Posted by Valmont
I do not know as I cannot say I precisely know what a three-legged kurquet from Rigel 7 is. But, as I said, if he has a moral interest then he is different from God in an important way.
You shouldn't need to know what a three-legged kurquet from Rigel 7 is to draw the parallel.

Quote:
Originally Posted by Valmont
No. God does care about the welfare of humankind. He desires us to benefit from the government of morality. But He does not, Himself, have an interest in morality. He is not Himself a beneficiary of the government of morality. Therefore He exists outside of the moral contract.
I'm sorry, but your point is not clear. At one point you said that moral obligations only exist between humans, so we can't assume that god has any moral obligation because he's not human. But you agree that if members of two completely separate species (in different star systems, no less) interact, then there can be a moral obligation between them as long as there is interest in morality "holding sway". Then you go on to say that god cares about morality, but isn't bound by it.

What exactly do you mean by "moral interest" and what exactly do you mean by "holding sway"? I thought I understood... but I fear I am actually drowning in a morass of shifting semantics...
muidiri is offline  
Old 11-03-2004, 04:48 AM   #75
Banned
 
Join Date: Oct 2004
Location: Pacific Northwest
Posts: 10,066
Default

Valmont... how do you feel about the statement "I brought you into this world, and I can take you out of it?"
muidiri is offline  
Old 11-03-2004, 04:55 AM   #76
Banned
 
Join Date: Oct 2004
Location: Pacific Northwest
Posts: 10,066
Default

Quote:
Originally Posted by Valmont
For a start, it is not at all clear to me that Shea and Wilson do have an obligation to the computer program, Malik. You have assumed that they do and have used that assumption as the basis for your argument. This is mere question-begging. I do not doubt that Joe Malik could consider Shea and Wilson cruel or even immoral. The question is whether it is reasonable for him to do so.
Why would it be unreasonable for Joe Malik to consider Shea and Wilson immoral?
Do Shea and Wilson have an understanding of morals?
Does Joe Malik have an understanding of morals?
Is Joe's understanding of morals completely different from Shea and Wilson's?
As an objective outsider (neither Joe nor Shea/Wilson), what reason could Shea and Wilson give you, that you find acceptable, that explains why they have no obligation to behave morally toward Joe?
muidiri is offline  
Old 11-03-2004, 10:08 AM   #77
Banned
 
Join Date: Oct 2004
Location: Pacific Northwest
Posts: 10,066
Default

Why isn't this in the Moral Foundations & Principles section? :huh:
muidiri is offline  
