FRDB Archives

Freethought & Rationalism Archive

The archives are read only.


Go Back   FRDB Archives > Archives > IIDB ARCHIVE: 200X-2003, PD 2007 > IIDB Philosophical Forums (PRIOR TO JUN-2003)
Old 04-22-2003, 08:18 AM   #61
Veteran Member
 
Join Date: Aug 2000
Location: Australia
Posts: 4,886
Default

HelenM:
...I would have liked to see the site address how we can know that that culture is 'wrong' and we are 'right'. I suppose the author of the site would say that the morality that agrees with the Bible is right....
Well the quote said "Moreover, who or what determines these universal principles?".... with no reply. I guess they are implying that the Bible is the answer to this problem. Much of the text would be based on non-religious psychology textbooks and in those books they would be talking about which humans decide these universal principles (or at least that's how I'd interpret it).

But as I said, I'm trying to understand objective morality as held to by atheists.
Well I don't believe in objective morality, but I like Kohlberg's framework. His theory is usually taken in a non-religious sense - in psychology textbooks, etc. Or if religion is involved, they would relate it to people's individual religions rather than the one "true" religion.

I think my confusion is more to do with the multiple meanings of desire....
Yeah.... well I thought you might like to hear some of my kind-of-related ideas.
excreationist is offline  
Old 04-22-2003, 08:53 AM   #62
Veteran Member
 
Join Date: Mar 2002
Location: 920B Milo Circle Lafayette, CO
Posts: 3,515
Default

Quote:
Originally posted by excreationist
I'm saying that our brain works out how to manipulate the world (and our "imagination") in order to maximize the pleasure signals or pain signals it will receive in the future. Though we aren't normally aware of that... we think we want external things like apples or cars, etc.
Nobody denies that a desire for pleasure and an aversion to pain exist. (Well, there are a few, such as the philosophical duo Paul and Patricia Churchland, but we can skip them for now.)

Your task here would be to show that they are the only things that exist, and that there is nothing missing from such an explanation.

For example, nobody denies that electrons exist. However, if somebody wants to claim that everything in the universe can be explained in terms of electrons, then that person faces a significant challenge. Listing stacks of experiments that make reference to electrons, and even listing a few examples that concern electrons alone and nothing else, are insufficient for this challenge. A single example of something that cannot be explained in terms of electrons alone is enough to defeat the theory.

So, if this account is complete, you must also be able to provide an explanation for the person who throws himself on a grenade to save his friends. You must be able to explain the sociopath, the psychopath, the masochist, and the sadist.

Most importantly, you need to explain to me why those people who do this research, who fully understand this research, are not taking every opportunity to hook themselves up to machines that deliver pulses to their pleasure centers. If everything we do aims at maximizing the impulses to these pleasure centers, and these researchers know that this is the way to do it, then they should be doing so.

If given an opportunity to be hooked up to such a machine, would you take it? Let's throw in a package to keep you well fed for the duration of your natural life (even though, ideally, this should not matter, because eating itself has value only insofar as it stimulates this part of the brain, according to this theory.)

If not, then the theory seems to be somewhat insufficient.
Alonzo Fyfe is offline  
Old 04-22-2003, 06:37 PM   #63
Veteran Member
 
Join Date: Aug 2000
Location: Australia
Posts: 4,886
Default

Alonzo Fyfe:
....So, if this account is complete, you must also be able to provide an explanation for the person who throws himself on a grenade to save his friends.
Ok, I'll attempt to explain those things.... (note that I'm not putting large amounts of thought into this, so I might revise my answers later)
"the person who throws himself on a grenade to save his friends"
Well I think our decisions are based on what pleasure/pain we are subconsciously expecting - and we also naturally empathize with people... but I think our empathy for people's future potential pleasure/pain is learnt (babies wouldn't initially be able to do it).
I said that "connectedness" was one of our fundamental drives... it makes us seek unity, familiarity, stability, coherence, etc. People usually develop ideals - like that the lives of others are important - and seeking "connectedness" involves living out your life close to your ideals. So I think the person is strongly motivated by their ideals - about how things "should" be. If their ideals involve helping others, they will feel better (be emotionally rewarded) when they see the smiles/gratitude of others due to our natural empathy. (Even babies empathize with many of the emotions they see in others, like happiness or fear.)

You must be able to explain the sociopath,
They simply have a malfunctioning empathy system. (I'm making up terms on the fly.) Or they might empathize with certain things like animals, but due to being hurt by people in their childhoods they associate people with pain - something that should be avoided (not empathized with).

the psychopath,
At dictionary.com it says this about the word "sociopath": "(`psychopath' was once widely used but has now been superseded by `sociopath') [syn: psychopath]"

the masochist,
They feel pleasure or at least feel willing to subject themselves to painful situations. This pleasure can be sexual or non-sexual. If it's non-sexual, the pleasure could be the "newness" pleasure (thrills) or the "connectedness" pleasure (familiarity) or the "relief" pleasure (after the pain ends). Different people have different "weightings" for different drives (so some people need more newness, etc) and masochists could have inappropriately weighted drives. I haven't read much about them but maybe earlier in life they experienced some pleasure (sexual pleasure, etc) while feeling bodily pain or humiliation... I think that we constantly associate all elements of our experience together so those people would feel a little sexual pleasure when they have bodily pain. When they experience both at once there is the added pleasure of familiarity. Note that pleasure isn't necessarily overwhelming... it can involve a subtle feeling of some contentment. Maybe in their childhood they subconsciously learnt that they were hugged/loved after they were punished or got hurt... I don't know... but anyway, I think the theories that psychologists have about it could fit into my framework...

and the sadist.
Sadists get sexual or other pleasure from inflicting physical and emotional pain on others. Maybe they were abused earlier in life and this is their way of getting revenge. Maybe revenge is motivated by the subconscious belief that destroying the source of your problem will end your problem.
I think sadism and revenge involve expressing an aggressive emotion that people feel when they are being competitive or feeling powerful. Maybe it is the "fight" part of the "fight or flight" fear reaction. Maybe the "fight" instinct involves an emotional reward if it is successful and this is why people like fights. Sadism could be a kind of fighting others - except that you are constantly winning. I'm not really sure about those who feel sexual pleasure from sadism... maybe it is due to some quirks in how aggression and sex instincts work. Those earlier explanations are just quick guesses... they are a lot less thought out than my earlier posts (those were based on ideas I've had for a couple years).

Most importantly, you need to explain to me why those people who do this research, who fully understand this research, are not taking every opportunity to hook themselves up to machines that deliver pulses to their pleasure centers.
This talks about rat experiments and the possibility of it being done in humans. It has other interesting things to say too.
Anyway, to choose to do that experiment indefinitely involves doing something unfamiliar and giving up a lot of familiar things... this involves a loss of connectedness... (a bit similar to feelings of alienation). I think people doing the experiment would no longer have strong conscious thought... I think conscious thought just involves seeking unmet goals - e.g. seeking newness (relieving boredom), etc. People would normally find the idea of choosing to lie there indefinitely, conscious but not bothering to think, repulsive. For scientists, wondering about things and exploring would be among their favourite activities. Doing that experiment would involve having an impoverished life - rather than a rich life full of possibilities. So for many reasons they'd find the experiment undesirable overall, compared to their life. Their intellect might tell them that they'd be more happy in the future if they did the experiment, but their subconscious reasoning might object for many different reasons. And we are compelled to do what our subconscious reasoning determines we're going to do.

If everything we do aims at maximizing the impulses to these pleasure centers, and these researchers know that this is the way to do it, then they should be doing so.
I think our goals/plan is formulated in a way to minimize our expected pain and/or maximize our expected pleasure.
e.g. say there is one kind of pleasure and one kind of pain. As I said earlier, I think different emotions can be experienced simultaneously.
Let's say there was a person in a field and there is a distant rose surrounded by wasps. Let's say the expected pleasure from picking and smelling the rose is +50, and being bitten by wasps is -1000, and the expected chance of being bitten is 90%.
So the expected overall emotional outcome for picking and smelling the rose is +50 + (-1000 * 90%) = -850.
I'm not sure about how to work out the alternative... it would probably involve a bit of potential regret, so maybe -50 or less overall (with no pleasure component).
In the first option, the pleasure centre would be giving the strongest reward, but *overall* it isn't the best choice.
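If it helps, the rose/wasps arithmetic above can be written out as a tiny sketch. The numbers are the made-up ones from this post; the -50 "regret" figure for walking away is likewise just a guess:

```python
# Toy expected-value sketch of the rose/wasps example (illustrative numbers
# only, not a real model of the brain): each option's "overall emotional
# outcome" is expected pleasure plus probability-weighted expected pain.

def expected_outcome(pleasure, pain, pain_chance):
    """Overall expected emotional outcome of one option."""
    return pleasure + pain * pain_chance

# Option 1: pick and smell the rose (+50 pleasure, 90% chance of -1000 pain)
pick_rose = expected_outcome(pleasure=50, pain=-1000, pain_chance=0.9)

# Option 2: walk away (no pleasure, a guessed -50 of certain regret)
walk_away = expected_outcome(pleasure=0, pain=-50, pain_chance=1.0)

print(pick_rose)  # -850.0
print(walk_away)  # -50.0

# The chooser takes the option with the best overall expected outcome.
best = max([("pick rose", pick_rose), ("walk away", walk_away)],
           key=lambda option: option[1])
print(best[0])  # walk away
```

So even though picking the rose offers the stronger pleasure reward, the probability-weighted pain dominates and the other option wins overall, which is the point being made above.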

If given an opportunity to be hooked up to such a machine, would you take it? Let's throw in a package to keep you well fed for the duration of your natural life (even though, ideally, this should not matter, because eating itself has value only insofar as it stimulates this part of the brain, according to this theory.)

If not, then the theory seems to be somewhat insufficient.

I've only heard about that experiment in rats lasting for a few days... I'm not sure if it would work indefinitely... maybe it would eventually create a shortage of dopamine and there'd be a "crash" or something.... (maybe).
But anyway, assuming that things would go ideally, I'm not sure if I'd want to do it. I have subconscious problems with it, like the ones I explained earlier. But I am depressed or even suicidal a lot so it would make sense to do it. I have been in a good mood lately so I had forgotten about my depression a bit. I'd want to do a lot of things in life first though, and when I think I've milked as much happiness out of the real world as I can, I'd be ready... (I might be fairly old by that time). If there was an ultimatum to do it now or never I guess I'd do it... assuming they'd guarantee that it would work properly... if it could cause brain damage I'd be more hesitant about doing it. I'm used to being hesitant about unfamiliar situations that can involve harm... I'm not hesitant about things like getting a thousand dollars with no strings attached though.
excreationist is offline  
Old 04-23-2003, 04:53 PM   #64
Veteran Member
 
Join Date: Mar 2001
Posts: 2,322
Default

Quote:
Originally posted by Alonzo Fyfe
Your task here would be to show that they are the only thing that exist, that there is nothing missing from such an explanation.
No one here is denying that mental events happen, and it is obvious that specific physical events cause the thinking to occur.

Quote:
For example, nobody denies that electrons exist. However, if somebody wants to claim that everything in the universe can be explained in terms of electrons, then that person faces a significant challenge. Listing stacks of experiments that make reference to electrons, and even listing a few examples that concern electrons alone and nothing else, are insufficient for this challenge. A single example of something that cannot be explained in terms of electrons alone is enough to defeat the theory.


But all we need to show the neural correlate of choice are studies showing that specific deficient physical elements impact choice to the extent to which they are deficient. These we have, and more are accumulating, thanks to rapidly proliferating functional brain imaging studies that concentrate on the neural "underpinnings" of choice, intent, consciousness, etc. I sent you links in another thread, to which you didn't respond.

Think of it this way: let's say you need to choose between two courses of action, and your memory banks send up all the relevant information, which you use with other memories to project possible ramifications. OK, all that is on the table; now, how do you choose? What is it your brain is actually doing to choose? How does door #2 get chosen? What does "choosing" mean to your brain?

Quote:
Most importantly, you need to explain to me why those people who do this research, who fully understand this research, are not taking every opportunity to hook themselves up to machines that deliver pulses to their pleasure centers. If everything we do aims at maximizing the impulses to these pleasure centers, and these researchers know that this is the way to do it, then they should be doing so.


It's because everything they know about life produces a decision-making capacity that rejects such a proposition.

Quote:
If given an opportunity to be hooked up to such a machine, would you take it? Let's throw in a package to keep you well fed for the duration of your natural life (even though, ideally, this should not matter, because eating itself has value only insofar as it stimulates this part of the brain, according to this theory.)

If not, then the theory seems to be somewhat insufficient.
It is not insufficient at all; the alternatives are weighed and that option is rejected.
DRFseven is offline  
Old 04-24-2003, 04:52 AM   #65
Veteran Member
 
Join Date: Mar 2002
Location: 920B Milo Circle Lafayette, CO
Posts: 3,515
Default

excreationist

Can you explain to me why I should not take your post as a simple admission that people aim at something else in addition to stimulation of the pleasure centers of the brain?

If people aim for nothing more than the stimulation of the pleasure centers of the brain, and A provides more stimulation than B, then the fact that somebody picks B shows that, even if stimulating the pleasure centers of the brain is one of the criteria of choice, it is not the sole criterion.

If you would opt for experiences in the real world over being hooked up to a machine that will stimulate the pleasure centers of the brain far more efficiently than experiences in the real world can, that shows that even if stimulating the pleasure centers of the brain is one of the criteria for choice, it is not your sole criterion for choice.


DRFseven, the same thing applies to you. Where you write "It is not insufficient at all; the alternatives are weighed and that option is rejected," it is still the case that the individual opted against the option that provided the strongest stimulation of the pleasure center. Therefore, when he weighed the options, he did NOT do so based on which would stimulate the pleasure centers the most, because ex hypothesi A is the best option.

Where the thesis is that X is the sole criterion of choice, this is tested by looking for options in which A has more X than B. The thesis says that A will be the option of choice (unless the agent is ignorant of the facts, or the agent is insane).

If, instead, B is the option of choice (and the agent is not ignorant of the facts, nor insane), then the only logically valid conclusion to draw is that the thesis that "X is the sole criterion of choice" is falsified.

Researchers most heavily involved in this type of research (thus, not ignorant of the facts, nor insane) do not go for hooking themselves up to such machines. X is not the sole criterion for their choices. Thus, the thesis is falsified.
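The falsification test above can be sketched as a toy check (the option names and pleasure numbers are mine, purely illustrative):

```python
# Toy version of the falsification argument: if X were the sole criterion of
# choice, the chosen option would always be the one with the most X. A single
# counterexample (choosing an option with less X) falsifies the thesis.

def thesis_holds(options, chosen, x):
    """True iff the chosen option has the maximum amount of X."""
    return x(chosen) == max(x(option) for option in options)

options = [
    {"name": "pleasure machine", "pleasure": 1000},
    {"name": "ordinary real life", "pleasure": 100},
]

# The researchers, knowing the facts, pick ordinary real life.
chosen = options[1]

print(thesis_holds(options, chosen, x=lambda o: o["pleasure"]))  # False
```

With the machine offering more X yet going unchosen, the check returns `False`, which is the sense in which the "X is the sole criterion" thesis is falsified here.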
Alonzo Fyfe is offline  
Old 04-24-2003, 06:01 AM   #66
Veteran Member
 
Join Date: Aug 2000
Location: Australia
Posts: 4,886
Default

Alonzo Fyfe:
....Can you explain to me why I should not take your post as a simple admission that people aim at something else in addition to stimulation of the pleasure centers of the brain?....
I'll quote part of my last reply to you:
Quote:
I think our goals/plan is formulated in a way to minimize our expected pain and/or maximize our expected pleasure.
e.g. say there is one kind of pleasure and one kind of pain. As I said earlier, I think different emotions can be experienced simultaneously.
Let's say there was a person in a field and there is a distant rose surrounded by wasps. Let's say the expected pleasure from picking and smelling the rose is +50, and being bitten by wasps is -1000, and the expected chance of being bitten is 90%.
So the expected overall emotional outcome for picking and smelling the rose is +50 + (-1000 * 90%) = -850.
I'm not sure about how to work out the alternative... it would probably involve a bit of potential regret, so maybe -50 or less overall (with no pleasure component).
In the first option, the pleasure centre would be giving the strongest reward, but *overall* it isn't the best choice.
Here's some more examples....
Say there are 3 choices:
a) pain of -100, pleasure of +50 (overall: -50)
b) pain of -500, pleasure of +400 (overall: -100)
c) pain of -10, pleasure of +30 (overall: +20)

I'm saying that we choose so that we maximize the expected overall result... so we choose (c).
(b) has the greatest amount of pain and also the greatest amount of pleasure. But overall it is the least attractive. Our choice isn't always about maximizing pleasure, it can also be about minimizing pain. In (c), pain/discomfort has been minimized, though it also involves the least amount of pleasure.

It is similar to cost and revenue (sales). Business people want to maximize profit and profit = revenue - cost. Maximizing profit isn't always about maximizing revenue - it can also involve minimizing costs.
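The three-option weighing above might be sketched like this (the pain/pleasure figures are the illustrative ones from the post):

```python
# Sketch of the three-option weighing: overall = pain + pleasure, and the
# chooser simply takes the option with the highest overall value
# (analogous to profit = revenue - cost).

options = {
    "a": {"pain": -100, "pleasure": 50},   # overall: -50
    "b": {"pain": -500, "pleasure": 400},  # overall: -100
    "c": {"pain": -10,  "pleasure": 30},   # overall: +20
}

def overall(option):
    """Overall emotional result of an option."""
    return option["pain"] + option["pleasure"]

choice = max(options, key=lambda name: overall(options[name]))
print(choice)  # c
```

Option (b) has both the most pleasure and the most pain, yet (c) wins: maximizing the overall result here works by minimizing pain, not by maximizing pleasure.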

If people aim for nothing more than the stimulation of the pleasure centers of the brain,
I didn't say that! They also try to minimize the intensity of pain signals coming from the pain center(s?) of the brain. And by pain I also mean things like frustration, guilt (perhaps a form of a lack of connectedness/unity), boredom (lack of newness), etc. But if the pleasure center is stimulated directly so that it is extremely intense, it would outweigh other considerations (guilt, boredom, etc).

and A provides more stimulation than B, then the fact that somebody picks B shows that, even if stimulating pleasure centers of the brain is one of the criteria of choice, it is not the sole criterion.
I said that the pain center(s?) is/are also involved!

If you would opt for experiences in the real world, over being hooked up to a machine that will stimulate the pleasure centers of the brain far more efficiently than experiences in the real world, that shows that even if stimulating pleasure centers of the brain is one of the criteria for choice, it is not your sole criterion for choice.
Pain center(s?) are also involved. But anyway, there is a difference between someone hesitating and resisting something based on what they imagine it to be like, and how they'd respond after actually experiencing it. e.g. say there was a substance that a person had a history of being addicted to - like alcohol. If you asked them if they wanted alcohol (and they hadn't had it for a long time), alcohol would be more of an abstract idea, and they would resist. But if you tricked them into drinking it, it could be very easy for them to become an alcoholic again. I think the pleasure center thing would be like that. It mightn't seem deeply appealing (it is very unfamiliar to me after all), but if I experienced it I think I would think - "that was unbelievably fantastic!!!!! More more more more more!!!!" And I might be using the machine and people are saying "remember you were only going to go on it a while?" and I might reply "just a bit longer!!! This is fantastic!!!" But at the moment it seems so abstract. Those desires *seem so hypothetical* rather than being grounded in reality... for me to truly crave something it helps if it seems real to me, and that I've experienced something close to it before. I haven't had any drugs that are supposed to get you really high and have "brain orgasms", etc (I think speed or something is meant to be like that).
When we make decisions we base them on the strengths of associated emotions we are experiencing *at the present* - those emotions aren't necessarily accurate summaries of the emotions we'd experience during hypothetical future events. e.g. say we absolutely love red - and there were two choices of cars - a red car with some fairly severe mechanical problems, and a blue car with no noticeable problems. Rationally, the blue car seems like a better choice. But the person might choose the red car anyway, since this is the only opportunity to buy a car (say they were the only two cars in the world) and since they absolutely love red. Well after a few years the repair bill might run into many thousands of dollars for the red car and the person would be quite upset. If they bought the blue car they'd regret it a bit, but they might put red things all over the inside of the car to cheer themselves up - so the blue car might feel *better* for them after a while - but worse *initially*. (so the red car is actually chosen)
excreationist is offline  
Old 04-24-2003, 07:40 AM   #67
Veteran Member
 
Join Date: Mar 2002
Location: 920B Milo Circle Lafayette, CO
Posts: 3,515
Default

Quote:
Originally posted by excreationist
Alonzo Fyfe:
Here's some more examples....
Say there are 3 choices:
a) pain of -100, pleasure of +50 (overall: -50)
b) pain of -500, pleasure of +400 (overall: -100)
c) pain of -10, pleasure of +30 (overall: +20)
In my example, I was talking about scientists who have sufficient knowledge to create an option "d" (hooking themselves up to machines) where the "overall" is +100. Yet, they do not choose this option.

You yourself have said that if I could provide you with a room where you can comfortably and reliably, automatically, receive jolts to your pleasure center, where all you do is lie there and get jolted, with no pain, automatic feeding, and the like (clearly an "overall = +100" option) you would not take it.

We can add the assumption that the machine is 100% reliable. The chance of pain is 0%. Your body will be fed and the best medical science will keep you experiencing pleasure jolts to your brain far longer, far more reliably, and with far less chance of negative experience than you can hope for in the "real world."

Yet, given this choice, most people (and, I assume, even you) would NOT take the option.

This cannot be explained by stating that the machine option somehow yields less pleasure and more pain than real-world experience. Ex hypothesi we are talking about a case where the machine experience is more pleasurable with no chance of pain or any negative experience (all that is happening is that you are lying in a chair getting jolts to your pleasure center from a machine designed to maximize the pleasure).

Can I sign you up?

If your thesis is correct, this is a no-brainer. There is no reason to even stop and think about it. Nobody -- no sane person -- would have to stop and think about it. The option with the greatest pleasure and least pain is clearly the best option. "Ah, if only we could make such a machine and make it available to everybody, the world would be a perfect place," you should say. "We, the whole human race, should be lamenting the absence of such a machine and devoting as much research as possible to developing one, because jolts of pleasure and freedom from pain are the ONLY things that matter."

Absent such a response, we have reason to believe that jolts of pleasure and freedom from pain are not the only things that people value. These are not the only criteria of choice.
Alonzo Fyfe is offline  
Old 04-24-2003, 09:16 AM   #68
Veteran Member
 
Join Date: Aug 2000
Location: Australia
Posts: 4,886
Default

Alonzo Fyfe:
In my example, I was talking about scientists who have sufficient knowledge to create an option "d" (hooking themselves up to machines) where the "overall" is +100. Yet, they do not choose this option.
Actually it would be something like +1000 or more...... but anyway, this "weighing up" is done using your *internal system*. The result may be different to what you rationally think is the best choice. e.g. problem gamblers might realise they have a problem but not be able to stop. I think the pleasure from the thrills they are getting as well as the comfort from the familiar habit is outweighing the guilt they feel. (Though they still feel the guilt as well)
Or a person might have a fear of heights. I think it may be because they had a fall or many falls earlier in life that caused them a significant amount of pain, which made them strongly associate the expectation of pain with being high up. They might be well aware that it is perfectly safe (nearly impossible to lead to future pain), yet they have an intense fear (the subconscious expectation of pain).

You yourself have said that if I could provide you with a room where you can comfortably and reliably, automatically, receive jolts to your pleasure center, where all you do is lie there and get jolted, with no pain, automatic feeding, and the like (clearly an "overall = +100" option) you would not take it.
I said I would hesitate, since I'd be leaving a lot (my normal life, which I've lived for 24 years). Initially we were talking about this being a permanent situation... so I'd hesitate to make sure that I wasn't making a rash decision. The machine might seem to be obviously a good idea at first but may in fact be a mistake. (So I'd think about it carefully.) If I was given the opportunity to rule another planet and be rich I would take it, but I'd hesitate as well and probably want to do some things on earth first.
Even if it was supposed to be a temporary experiment, if it is the ultimate "high" with no consequences then it seems that I wouldn't be able to resist once I had begun. (Assuming that I was at the controls)

This cannot be explained by stating that the machine option somehow yields less pleasure and more pain than real-world experience.
What matters is our current *emotional responses* to the options. Not what we think our emotional responses are supposed to be... but what they actually *are*. e.g. Say someone has a fear of heights (they have an intuitive expectation of pain) and they are asked to walk across a tightrope while they're in a harness. They might say to themselves "there's no reason to be scared!" and vaguely believe it - on a "rational level". But if their actual expectation of pain is too great, their fear will overcome them and they won't be able to do it. There would be some positive emotions that are encouraging them to do it though... they would feel a sense of being normal if they could do it (connectedness).
The harness could be a double harness, making it virtually impossible to get hurt... and it could be above a soft surface so even if they fell they wouldn't get hurt badly. So the pain they intuitively predict is different to the *actual* likely pain. In my last post I also gave the red and blue car example to try to explain this point. The gambler example is also relevant.

.....If your thesis is correct, this is a no-brainer. There is no reason to even stop and think about it. Nobody -- no sane person -- would have to stop and think about it. The option with the greatest pleasure and least pain is clearly the best option.
The thing that weighs up the options is in the brain - I think it processes and compares quantities of chemicals... just reading about something on a page doesn't necessarily immediately cause you to generate the appropriate amount of chemicals for that choice... especially if the subject matter isn't intimately known to the brain, on a subconscious level.

"Ah, if only we could make such a machine and make it available to everybody, the world would be a perfect place," you should say. "We should, as a species, be lamenting the absence of such a machine and devoting as much research as possible to developing one, because jolts of pleasure and freedom from pain are the ONLY things that matter."
Well ordinarily we crave things like familiarity (with our past, etc - such as the habit of doing useful things quite often) and newness (exploring/discovering things) so the idea sounds quite repulsive. But I think it would be a different matter if you are in the machine - I think those considerations would no longer seem important. I think many people wouldn't want to use the machine at all.... if they see that it creates "zombies". They might agree that the zombies are extremely happy but they'd say that it isn't natural, and that we're meant to work on earth - and that the true paradise is in the next life. The poor, who make up the majority of the world, wouldn't be able to afford it. And many or most wouldn't use it - for religious or other reasons - and you can't force them to use it (unless they were a small minority and others didn't mind forcing them). There would have to be people maintaining things - or highly intelligent robots. The people left outside might decide to kill off the hedonistic zombies. Anyway, there are lots of problems.

Absent such a response, we have reason to believe that jolts of pleasure and freedom from pain are not the only things that people value.
It is some part of the limbic system (or in that area) that uses those chemicals to make the brain's decisions! And when we're talking about this machine hypothetically, the amount of pleasure chemicals that represents the possibility of using the machine isn't at an appropriate level... there is only a weak association between the words I'm reading that describe this machine and feelings of expected pleasure. To have a strong association I'd have to use the machine; then being reminded of the machine would trigger strong emotions. Before using the machine, people are only imagining, in a vague way, what lots and lots of pleasure might be like. This is different from what the actual pleasure would be. (Like the fear of heights example.)

These are not the only criteria of choice.
What else do you think is involved with choice? Are only values involved? I think that all of our values are based on our fundamental drives (like seeking newness, connectedness [with a worldview/community, etc], relief/relaxation and sucking/chewing - and avoiding frustration and bodily pain, etc)
excreationist is offline  
Old 04-24-2003, 02:36 PM   #69
Veteran Member
 
Join Date: Mar 2002
Location: 920B Milo Circle Lafayette, CO
Posts: 3,515
Default

Excreationist:

In the first part of your response you seem to be arguing that the whole (or vast bulk) of humanity which would not want to be hooked up to such a machine suffers from some sort of mass insanity or irrationality, akin to a fear of heights, which prevents its members from recognizing the true value of sitting as zombies with electrodes stuck into their brains, obtaining jolts to the pleasure center of the brain.

Ultimately, I am interested in why you think a theory that requires postulating a mass delusion in order to explain human decision making is to be preferred to a theory that postulates criteria of choice in addition to pleasure and avoidance of pain. If we can have two criteria of choice, why not three -- particularly when two fail to fully account for the choices that people make?


Later in your response, instead of using an 'irrational impulse' hypothesis, you offer a 'rational impulse against unaccounted-for pains' hypothesis. The problem with this answer is that if, indeed, the reason to refuse such an option is a set of unmentioned costs, motivation would still be directed toward removing those costs, accompanied by regret if they cannot be removed. Think, for example, of how you would react to a contest that promises you a substantial prize if you could win it. Inability to win the contest elicits regret. But people do not seem to have much regret over the unavailability of such machines. At least, I do not sense any such regret on my part, nor on the part of anybody else I talk to on this matter.


Still later, you use a third response, a 'try it, you'll like it' response. Here, I wish to offer that it is more plausible to say that the experience changes your desires by altering the structure of the brain, rather than capturing the realization of an existing desire. That is to say, the experience is an acquired taste (like the taste for beer or coffee, acquired because of the positive experience of the drugs contained within those drinks).


In answer to your question, "What else do you think is involved in choice? Are only values involved?"....

The most popular theory in the field today is BDI (belief-desire-intention) theory. In the vast majority of my writings, this is the theory that I present and defend, simply on the grounds that I should not argue against the experts in the field unless I can claim to know more than they do.

BDI theory holds that desire provides all of the motivational force for human actions, but that desire can take any number of objects (not limited to 'jolts of pleasure' and 'avoidance of pain').

One of the arguments in favor of BDI theory (and against pleasure/pain theory) concerns evolutionary complexity. A decision matrix (for an antelope, for example) that goes straight from "I see a lion" to "RUN!" is far simpler and more evolutionarily practical than one that goes "Lion -> Lions are associated with pain -> pain is bad -> pain can be avoided by running -> RUN!"

Evolution would favor having antelopes that run from lions BEFORE learning that lions are associated with pain, rather than after.

On these grounds, it is argued, we can expect our brains to be wired for instant responses to certain types of stimuli, without going through any sort of cognitive processing, which can do little more than slow down reactions and get one eaten.

This is not to say that pleasure and pain do not play an important role among our desires. It argues that pleasure and pain do not play an EXCLUSIVE role, that the brain contains other programming as well.
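The contrast between the two decision paths can be sketched in code. This is a purely illustrative toy - the dictionary names and entries are invented, not taken from any actual model in the thread:

```python
# Toy sketch of the evolutionary-complexity argument (all names invented).

# Hard-wired response: a single lookup from stimulus straight to action.
REFLEXES = {"lion": "run"}

def reflex_response(stimulus):
    """Stimulus -> action, with no intermediate evaluation."""
    return REFLEXES.get(stimulus)

# Pleasure/pain account: a chain of learned associations that must be
# traversed (and learned in the first place) before the antelope runs.
PAIN_SOURCES = {"lion": "pain"}   # "lions are associated with pain"
AVOIDANCE = {"pain": "run"}       # "pain can be avoided by running"

def evaluated_response(stimulus):
    """Stimulus -> predicted outcome -> avoidance action."""
    outcome = PAIN_SOURCES.get(stimulus)
    return AVOIDANCE.get(outcome)

print(reflex_response("lion"))     # run
print(evaluated_response("lion"))  # run
```

Both paths end in the same action, but the second only works after the pain association has been learned - which is the point of the argument: evolution would favor antelopes born with the first.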
Alonzo Fyfe is offline  
Old 04-24-2003, 07:35 PM   #70
Veteran Member
 
Join Date: Aug 2000
Location: Australia
Posts: 4,886
Default

Alonzo Fyfe:
I think you're misunderstanding me. A lot of the problem would be from me only gradually revealing my ideas, or from me sometimes being unable to think of the whole framework at the time. Maybe I'm mixed up a lot of the time.

In the first part of your response you seem to be arguing that the whole (or vast bulk) of humanity which would not want to be hooked up to such a machine suffers from some sort of mass insanity or irrationality, akin to a fear of heights, which prevents its members from recognizing the true value of sitting as a zombie with electrodes stuck into its brain obtaining jolts to the pleasure center of the brain.

They haven't experienced it so they can't truly imagine how pleasurable it is!!! Making decisions involves automatically/subconsciously imagining how pleasurable (or painful) possible events would be. And this estimation is based on past *experience*. e.g. say our friend shows us an unusual food and tells us that it tastes good. If taking the friend's advice in the past had led to increased pleasures and/or decreased pains (including the pain of frustration), we'd *intuitively* (automatically) believe that the friend is probably telling the truth - that the food is tasty - and from past experience, we know that tasty foods are pleasurable. There would be little negative side to the decision - there would be a possibility that the food is poisonous or revolting, but based on the accumulation of life experience (learnt subconsciously) we'd subconsciously believe that is unlikely, so trying the food seems like a good choice. If we had had that food before we'd hesitate (weigh up options) much less, since we'd have a firmer belief that the food tastes good. (A stronger association between the food and future pleasure)
I think the fear of heights is actually sort of rational.... maybe people who have a severe fear of heights have only had a very limited amount of experience with heights, and amongst that small amount of experience, a very significant amount involves being hurt - or maybe seeing or hearing about others being hurt. Since a large proportion of that experience involves getting hurt (it could be from when they were a toddler or a child), *intuitively* it seems to them that there is quite a high chance of being hurt, even though they might logically agree that they would be safe in a harness. And what they intuitively determine the likely outcome to be is what their brain acts on.

Ultimately, I am interested in why you think a theory that requires postulating a mass delusion in order to explain human decision making
We crave familiarity!!! (Connectedness) I've said that many times. That would make people apprehensive (going on the machine would result in less connectedness with their beloved past life). Also, due to a lack of similar experiences to base a prediction on, their brain wouldn't estimate a very strong amount of pleasure... if a person had had drugs that gave them brain orgasms(?) (maybe speed? or something) and were told that the pleasure machine is like that, they would link the experience of the drugs with their intuitive prediction of what the machine's pleasure would be like... their brain would basically think that the machine gives that same amount of pleasure (as the drug) - or perhaps greater. If a person hadn't had the drug, the amount of pleasure would be abstract to the person. The person wouldn't be able to properly imagine it.

is to be preferred to a theory that postulates criteria of choice in addition to pleasure and avoidance of pain.
I'm saying that maximizing *intuitively predicted* pleasure and minimizing *intuitively predicted* pain is the basis of choice. I think we choose whatever emotionally *seems* most appealing. I'm saying we learn to associate situations and other things with emotions (so we do have a criterion of choice).
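As a rough illustration of that model - the option names and numbers below are made up for the sketch, not taken from anything in this thread - choice could be sketched as picking whichever option has the highest intuitively predicted pleasure minus predicted pain:

```python
# Hypothetical sketch of choosing by *intuitively predicted* (not actual)
# pleasure and pain; all option names and numbers are invented.
predicted = {
    "familiar meal":    {"pleasure": 6, "pain": 1},  # strong learned association
    "unfamiliar food":  {"pleasure": 3, "pain": 2},  # vaguer association
    "pleasure machine": {"pleasure": 2, "pain": 4},  # never experienced: weak
                                                     # predicted pleasure, strong
                                                     # predicted loss of familiarity
}

def choose(options):
    """Pick whichever option emotionally *seems* most appealing."""
    return max(options, key=lambda o: options[o]["pleasure"] - options[o]["pain"])

print(choose(predicted))  # familiar meal
```

On this model the machine loses out not because its actual pleasure would be low, but because the brain's learned *prediction* of that pleasure is weak.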

If we can have two criteria of choice, why not three -- particularly where two fail to fully account for the choices that people make?
I don't think you understand my theory yet. Note that I mightn't have explained it properly earlier on. (It is about intuitively expected pleasures and pains rather than *actual* future pleasures and pains)

Later in your response, instead of using an 'irrational impulse' hypothesis, you offer a 'rational impulse against unaccounted-for pains' hypothesis. The problem with this answer is that if, indeed, the reason to refuse such an option is a set of unmentioned costs, motivation would still be directed toward removing those costs, accompanied by regret if they cannot be removed. Think, for example, of how you would react to a contest that promises you a substantial prize if you could win it. Inability to win the contest elicits regret. But people do not seem to have much regret over the unavailability of such machines. At least, I do not sense any such regret on my part, nor on the part of anybody else I talk to on this matter.
Consider the "ruler of another planet" example I gave in my last post. Using the machine forever is similar to that example since it involves leaving our present earthly life. Being the ruler of another planet would be very attractive to me, but still, I'd want to do some more things on earth before I left. I'm saying we crave the familiarity of our pasts (connectedness). I'd also hesitate about committing because I have a habit (I think habits involve seeking connectedness/familiarity) of thinking big decisions over carefully. It also seems logical to think carefully about important things, and being logical seems like a good way of getting what I want. (I probably learnt that lesson subconsciously in the past)
In your example of a contest where you can win a big prize, *none* of the things you love are negatively affected in the future. The prize is just a future addition to the things you would love. But in the machine and ruler-of-another-planet examples, the things you presently love would be taken away from you!
I think we'd only truly regret things if our perception of things changed - e.g. if we discovered that we would have won that contest (that we didn't enter) we'd feel regret. Or if we decided not to be a ruler of another planet for some reason and later our life became really depressing (we could be falsely imprisoned for life, etc) - we might change our mind due to the new circumstances. I guess in the wasps example there would be no regret really - just a frustrated desire for the rose. There would be a desire for the rose - but it would be frustrated due to the overwhelming assessment that it would be a bad idea to pick and smell the rose (due to the wasps). So do you have *no* desire at all to have the pleasure centre of your brain stimulated? The part you perceive could be good (a little stimulation on the machine) is like the rose - and the parts you perceive to be bad (the loss of your lifestyle, having to be unproductive) are like the wasps. And I think the perceived negative aspects outweigh the perceived positive aspects. So is *anything* about the idea appealing? (having *some* extra pleasure via the machine)
When the decision is very close then we can have regret too, since our perception of likely outcomes could be oscillating between one outcome seeming more appealing than the other.
So you wouldn't feel regret, but perhaps a frustrated desire for an aspect of it (the promise of some more pleasure).

Still later, you use a third response, a 'try it, you'll like it' response. Here, I wish to offer that it is more plausible to say that the experience changes your desires by altering the structure of the brain, rather than capturing the realization of an existing desire. That is to say, the experience is an acquired taste (like the taste for beer or coffee, acquired because of the positive experience of the drugs contained within those drinks).
Yes, the positive *experience* of those drugs.... i.e. "try it, you'll like it".

BDI theory holds that desire provides all of the motivational force for human actions,
I agree.

but that desire can take any number of objects (not limited to 'jolts of pleasure' and 'avoidance of pain').
Pleasure isn't necessarily in jolts! I've been saying that I think there are many types... some would act faster than others... e.g. I think the warm-fuzzy feeling of deep familiarity is slower-acting than deep relaxation/relief, sucking/chewing, sweet tastes, etc.
And by pain I'm including things like frustration.

On these grounds, it is argued, we can expect our brains to be wired for instant responses to certain types of stimuli, without going through any sort of cognitive processing, which can do little more than slow down reactions and get one eaten.
So you mean things like inhibition can be involved - to override instinctual behaviours.

This is not to say that pleasure and pain do not play an important role among our desires. It argues that pleasure and pain do not play an EXCLUSIVE role, that the brain contains other programming as well.
Yeah, I agree it contains other programming, but humans only have a few reflexes/instincts.
About what you're saying - offhand, I think the automatic blinking of our eyes doesn't involve seeking pleasure or avoiding pain. But I think consciously controlling it involves some pain/discomfort being felt - I think all effort results in some feelings of pain/discomfort, though they might be hardly noticeable. The same goes for breathing... it is usually an automatic response, but we can control it, and I think when we control it we feel some discomfort/pain due to the effort... I think controlling it involves consciously overriding the motor part of our brain that is responsible for it. Well, that is off the top of my head, based on a rough idea of how I think the brain works.
excreationist is offline  
 
