Freethought & Rationalism Archive. The archives are read only. |
12-13-2002, 03:57 PM | #11 |
Senior Member
Join Date: May 2002
Location: New Jersey, USA
Posts: 545
|
I would concede that programming a very select set of specific behaviors would be beneficial. I think you'd run into various kinds of problems if you tried to apply this to a large set of issues.
What do you do if/when the programming fails or is not present? If we're programmed not to be violent, what happens when we come into contact with someone without such programming? Who decides if something is good or bad? You should have near universal consensus that child rape is bad. What about lying? Again, most people will agree that lying is bad. But is it always bad? Would we be willing to give up lying forever? Our sense of right and wrong changes - both as a society and as individuals. Well, at least that's what I claim; I'm a relativist. There is no one accepted moral system; at most there is agreement on specific behaviors. I think that's as far as you could go. |
12-13-2002, 06:07 PM | #12 |
Junior Member
Join Date: Oct 2002
Location: N 47° 11’ 14”, W 122° 10’ 08”
Posts: 82
|
Greetings, Earthlings.
It IS undesirable to be forced IN ANY WAY to do ANYTHING, even in the name of something good. 'Good' is a VERY subjective term, too. Good for whom? When? Good intentions? (BTW, I am quite fond of the phrase 'Good intentions pave the road to Hell.') IMHO, it is much better to have people aware of and educated about morals and ethics, so that they may act in the best way possible. The mere fact that one is programmed to do something already makes it bad. Programming people to do good things also sounds like a typical dystopia plot. |
12-16-2002, 04:55 AM | #13 |
Veteran Member
Join Date: Oct 2001
Location: U.S.
Posts: 2,565
|
Again, I think some of you are missing my point. Admittedly, this is due to the ambiguous wording I've used.
I'm not talking about programming existing people. I'm talking about "programming" on a basic level, that is a fundamental aspect of your personality which was created with a specific intent rather than allowed to form chaotically through nature and nurture. Evolution has "programmed" most of us not to want to kill ourselves. If human beings in general were programmed to have a different set of basic drives, responses, and emotions, and that resulted in less evil committed against other humans, would this be bad? Jamie |
12-16-2002, 09:01 AM | #14 |
Veteran Member
Join Date: Jul 2002
Location: Overland Park, Kansas
Posts: 1,336
|
Jamie:
There is a difference between having a desire not to do something, and being unable to even conceive of doing it. I don't mind having 'preferences'. I would not want to live without (at least the illusion of) being able to consider and weigh all my options. Keith. |
12-16-2002, 12:09 PM | #15 |
Junior Member
Join Date: Oct 2002
Location: N 47° 11’ 14”, W 122° 10’ 08”
Posts: 82
|
I think the problem is a definition of the word "programming."
A program is (at least the kind of program we're talking about): 6. A set of coded instructions that enables a machine, especially a computer, to perform a desired sequence of operations. That means you have to have a.) coded instructions of some kind (maybe genes or memes) and b.) a desired sequence of operation. |
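That dictionary definition can be made concrete with a tiny sketch (mine, not from the thread): part (a) is the coded instructions, part (b) is the desired sequence of operations that executing them produces. The instruction names are made up for illustration.

```python
# (a) coded instructions of some kind - here, a plain list of symbolic steps
instructions = ["wake", "wash", "eat"]

def run(program):
    """Execute each instruction in order, returning the performed sequence."""
    # (b) the desired sequence of operations, produced by following (a)
    return [f"perform:{step}" for step in program]

print(run(instructions))
```

The same shape fits the thread's analogy: swap the list for genes or memes and `run` for development or upbringing, and the definition still applies.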
12-17-2002, 12:27 AM | #16 |
Junior Member
Join Date: Nov 2002
Location: Belgium
Posts: 75
|
There's no problem at all with that word; it's perfect, in the sense that both nature and nurture are forms of programming. Our genes are the program that our cells
use to replicate themselves and recreate the body. Our education and nurture program our thoughts to a certain level of functionality. A good example of how education is also programming is this: I must wash my hands after going to the toilet. By the words and examples of my parents, I was programmed to perform the act of hand washing after the set condition was met. If I cannot do as I was programmed, I feel very unpleasant, and will still wash my hands later on. Even forceful ways of programming people, like brainwashing, are in essence also forms of education. However, you cannot use violence to teach people nonviolence. Violence is in essence forcing your will upon others, completely disrespecting their will. If you force or forcefully teach others to be nonviolent, they will realise your hypocrisy and ultimately reject you and your teachings. All people have the will to power, but also the will to cooperate. To program a person correctly, you must not oppose their will to power, but use their will to cooperate and reinforce their sense of power. We learn so we can achieve more. The long-term benefits of ethical behaviour should be the stimulus for encouraging and teaching ethical behaviour. |
12-17-2002, 07:26 AM | #17 | |
Veteran Member
Join Date: May 2001
Location: NW Florida, USA
Posts: 1,279
|
Jamie_L,
Quote:
By limiting the potential for evil, I suspect you would also limit the potential for good. For example, a knife can be used for both good and evil. From this point of view, I guess the answer to your question comes down to your feelings on liberty. Would the world be better off without knives? Now instead of a knife, let's consider human ambition. In some cases, ambition leads to evil actions, while in others, it drives people to greatness. Would the world be better off if human nature did not include ambition? I'm inclined to say no. However, you might be talking about a programming which does not limit potential, but only addresses the expression of that potential. That is, the knife is still around, but it is only used for cutting steak. No one would consider using it for cutting another human being. While this is a preferable situation, I'm not sure it would be good to impose it via programming. Again, the knife cuts both ways. It seems to me that everything which contains the potential for evil also contains an equivalent potential for good. Whatever your programming constrains will limit some potential for good. If you limit the freedom to think in a particular way, you will have destroyed the good that can come from the freedom to think in that manner. I believe a program that limits the potential of human thought would limit the good, and hence would be a bad thing. I guess the controversial claim is that the potentials for good and evil are connected. I don't have any good arguments for this claim. It just seems to me that there are many examples of its truth. |
12-17-2002, 11:40 AM | #18 |
Junior Member
Join Date: Oct 2002
Location: N 47° 11’ 14”, W 122° 10’ 08”
Posts: 82
|
Borean:
I agree with you on that level: our sociological and genetic behaviours are passed on from generation to generation. However, 'programming' in this fashion is not terribly active. It is a sort of passive system that we can only partially control. We can't really choose exactly which genes stay or leave when we bear children (at least not yet), and it is very hard (I'd argue impossible) to hide all of our behaviours, or memes, that are picked up by our children out of the womb. This is the face of evolution, and 'programming' in this sense is not bad at all (in fact, it is quite desirable). Programming people to think in a certain way, programming people not to think completely freely, although done with good intentions, necessarily limits man. Think of the Nazi efforts at eugenics and genocide - a form of programming and control. On the other hand, genetic engineering that programmes man to be resistant to viral strains, or to be stronger, or to have greater mental facility, is a good thing. I guess it is not so much the programming itself, but how it is carried out, and what its actual results are. Intention is meaningless, and should not be a basis for a moral standing. |
12-17-2002, 12:40 PM | #19 |
Contributor
Join Date: Jul 2000
Location: Lebanon, OR, USA
Posts: 16,829
|
Except that knives have handles, cases, and stuff like that for keeping them from causing trouble to their users.
Dr. Isaac Asimov addressed that question in his robot science-fiction stories. His robots are programmed with the "Three Laws of Robotics", which state:

1. A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

In effect, robots will be programmed to be very virtuous. These laws, I may add, can be interpreted as Three Laws of Tool Design, an interpretation thought of by someone else that Asimov once approvingly mentioned. [ December 17, 2002: Message edited by: lpetrich ] |
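One way to read the Three Laws is as priority-ordered constraints: each higher law vetoes whatever the laws below it would allow. This toy sketch is mine, not Asimov's or the thread's; the action flags (`harms_human`, etc.) are hypothetical stand-ins for the judgments the laws would require.

```python
def permitted(action, laws):
    """Check an action against laws in priority order.

    The first law with an opinion (True or False) decides;
    laws returning None defer to the next one down.
    """
    for law in laws:
        verdict = law(action)
        if verdict is not None:
            return verdict
    return True  # no law objects

# Hypothetical predicates standing in for each law's judgment.
def first_law(action):
    if action.get("harms_human"):
        return False  # never injure a human, whatever else is true
    return None

def second_law(action):
    if action.get("ordered_by_human"):
        return True   # obey orders (First Law was already consulted)
    return None

def third_law(action):
    if action.get("self_destructive"):
        return False  # self-preservation, lowest priority
    return None

laws = [first_law, second_law, third_law]
# An order to harm a human is vetoed by the First Law:
print(permitted({"harms_human": True, "ordered_by_human": True}, laws))  # False
# An order to do something self-destructive is compelled by the Second Law:
print(permitted({"ordered_by_human": True, "self_destructive": True}, laws))  # True
```

The priority ordering is the whole design: the Second Law only gets a say once the First has declined to object, which is exactly how Asimov's stories generate their conflicts.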
12-17-2002, 04:42 PM | #20 |
Junior Member
Join Date: Nov 2002
Location: england
Posts: 51
|
Ah but he forgot the most important rule:
4. A robot must not change its rules. If a robot was truly intelligent, couldn't it override these rules anyway, in the same way that we can override pain? Kryten overrode his rule not to lie, for example. |