FRDB Archives

Freethought & Rationalism Archive

The archives are read only.


FRDB Archives > Archives > IIDB ARCHIVE: 200X-2003, PD 2007 > IIDB Philosophical Forums (PRIOR TO JUN-2003)
Old 01-19-2002, 01:20 PM   #41
hedonologist
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222

Quote:
Originally posted by excreationist:
I'd get more RAM (short term memory or working memory) - I think that is the most important thing. I don't need any more long term storage though. My brain could be overclocked from something like 20-40 cycles per second to 1000+ Hz. It would be like upgrading parts of a computer.
Yeah just like a computer, hehe. Say you're considering upgrading your brain/body, installing a new one into your habitat, and selling your old brain/body as a discount combo deal. (It is "used", after all.) There must be all kinds of long-term data on your brain you are not using. You wouldn't need all of that data in your new body/brain, correct?

We could probably improve on the neurological structure of your brain, as well. I suppose you would like this structure to be optimized for your particular use. Now we just have to figure out what you use your brain for. I gather it has something to do with pleasure. How about if your new brain/body combo is much happier than the old one?
Quote:
Originally posted by excreationist:
quote:
--------------------
Originally posted by hedonologist:
...What if you wanted to get rid of cancer, so you had a whole exact duplicate body (with brain) made with the agreement that after you met the new body and checked to make sure it was working properly, you would take something to die. Would "you" "wake up" as the new body?
------------------

Well the "you" who had experiences after the duplication wouldn't be in the body, but the one immediately before the duplication (who decided to be duplicated, etc) would be in the body.

quote:
--------------------
Originally posted by hedonologist:
Would you use this method to get rid of cancer?
------------------

I guess... mainly so that I could meet myself.
Notice that you are referring to an "I" who would be having a different experience from whom you are calling "myself". I define a person as an experiencer, so if a being is having a different experience, they are a different person. "I" can't have a different experience from "myself".
Quote:
Originally posted by excreationist:
Well assuming that the person is unconscious during the copying, I think there aren't any problems - the original is copied while unconscious and then killed. The copy is revived. But if the original is conscious during or after the copying at all, then there are problems - one version of that person knows that they will die. (At least, if unconscious during the copying, the only one who wakes up knows that they will be alive)
Is there any problem with someone slipping you some drugs that will first knock you out cold (as far as we can see) and then kill you, assuming there were no chance of this not going smoothly? You apparently wouldn't *know* what happened.

Materialists and dualists could have some serious symbiosis, if we just had the technology. I'm so sentimentally attached to certain brain functions of my current brain that I would sell it real cheap with the understanding that it would be installed in a better body and allowed to do what it wants. But my old brain would be your brain, once it was installed in your body, so it would be doing things the "new you" wanted to do, not things "I" wanted. And guess what? You could make the payment to your own new brain! You just take the money out of your account and put it right back. And that is just the beginning. You would get to keep all of my material possessions, money, even any friends who don't particularly mind that my old brain was someone else! Before your old brain complains, you and I could find some way to make sure it didn't know what was going on. hehe

What have you got to lose?

[ January 19, 2002: Message edited by: hedonologist ]
hedonologist is offline  
Old 01-19-2002, 01:28 PM   #42
God Fearing Atheist
Veteran Member
Join Date: Sep 2000
Location: Massachusetts, USA -- Let's Go Red Sox!
Posts: 1,500

Quote:
Originally posted by hedonologist:
I'm trying to figure out what that means in terms of its implications regarding your behavior.
It really didn't mean anything: I was commenting on the usage.

But there are a couple related points:

1) Consciousness is a causal product of the brain.

If, for example, my brain were slowly replaced by parts from another, identical brain, my consciousness would remain intact. It's a fact that my brain has the causal powers capable of producing mental events. The same could possibly hold true with other materials; computer chips, for example. Assuming they have the same sort of causal properties, they too must necessarily produce this particular consciousness....*my* consciousness.

2) Related to the first, consciousness has a first-person ontology. Its mode of existence is subjective.

Now suppose we leave this particular brain alone, and construct another totally identical one, and place it in someone's body. Does it follow that *I* am that other person? I'm fairly sure it does not. My consciousness is subjective; it is, after all, *my* consciousness. The other fellow, while presumably having the same content of thought, emotion, what have you....is still, quite clearly, not *me*.

For example, suppose we take two glasses of water. The macro-property we call "liquidity" is a causal result of the micro-structure. If both have identical micro-structures, and if we assume causation is necessary, they must necessarily have the same macro properties. But it would be absurd to suggest that, because they have identical macro properties, they are *the same thing*. Glass X is not glass Y, by definition.... identical as it may be.
God Fearing Atheist is offline  
Old 01-19-2002, 02:18 PM   #43
hedonologist
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222

Quote:
Originally posted by God Fearing Atheist:
It really didn't mean anything: I was commenting on the usage.
OK, but I would still like to know how being a non-dualist would have any practical implication as far as behavior or something "material".
Quote:
Originally posted by God Fearing Atheist:
If, for example, my brain were slowly replaced by parts from another, identical brain, my consciousness would remain intact. It's a fact that my brain has the causal powers capable of producing mental events. The same could possibly hold true with other materials; computer chips, for example. Assuming they have the same sort of causal properties, they too must necessarily produce this particular consciousness....*my* consciousness.
If you had unlimited technological powers, could you test to see whether computer parts produce "this particular consciousness"? If so, how?
Quote:
Originally posted by God Fearing Atheist:
Now suppose we leave this particular brain alone, and construct another totally identical one, and place it in someone's body. Does it follow that *I* am that other person? I'm fairly sure it does not. My consciousness is subjective; it is, after all, *my* consciousness. The other fellow, while presumably having the same content of thought, emotion, what have you....is still, quite clearly, not *me*.
What if the brain were exchanged with your current brain?
Quote:
Originally posted by God Fearing Atheist:
For example, suppose we take two glasses of water. The macro-property we call "liquidity" is a causal result of the micro-structure. If both have identical micro-structures, and if we assume causation is necessary, they must necessarily have the same macro properties. But it would be absurd to suggest that, because they have identical macro properties, they are *the same thing*. Glass X is not glass Y, by definition.... identical as it may be.
Yes, but I would trade one glass of water for another, so my value for each is the same. Notice that some of the brains which are causing the patterns of letters you see before you are concluding that copies of themselves are just as "valuable" to them as "themselves" (in certain situations), because they think the copies "are" themselves.

So I am trying to translate phrases like "*the same thing*" or "quite clearly, not *me*", to their implications in terms of behavior.
hedonologist is offline  
Old 01-19-2002, 09:48 PM   #44
God Fearing Atheist
Veteran Member
Join Date: Sep 2000
Location: Massachusetts, USA -- Let's Go Red Sox!
Posts: 1,500

As to the first point, I don't believe there are any. It's just a simple ontological distinction that needs to be drawn. One can hold a belief in consciousness and maintain a "secular worldview", or at least believe consciousness isn't some magical "ooze".

As to the second, yes, I think you could, and could go about it in the same sort of way I described. Slowly replace existing neurons (or whatever) with a mechanical substitute, and report on the effects.

Finally, I really don't think this issue has any bearing at all on ethical or behavioral matters. People have no intrinsic value....it is ascribed to them, or sometimes not at all....me, clone me, or me with a clone brain. None of these things affect what we ought not do (I attempted to work this out in my rather long-winded post on contractarianism).
God Fearing Atheist is offline  
Old 01-19-2002, 11:39 PM   #45
excreationist
Veteran Member
Join Date: Aug 2000
Location: Australia
Posts: 4,886

Quote:
Originally posted by hedonologist:
<strong>Yeah just like a computer, hehe. Say you're considering upgrading your brain/body, installing a new one into your habitat, and selling your old brain/body as a discount combo deal. (It is "used", after all.) There must be all kinds of long-term data on your brain you are not using. You wouldn't need all of that data in your new body/brain, correct?</strong>
Of course I'd need it. Just about every bit of trivial information probably would be used again one day.

Quote:
<strong>We could probably improve on the neurological structure of your brain, as well.</strong>
Well mine is a bit unreliable, but on the other hand that is probably the cause of a lot of my creativity. So I might leave it the way it is. But make sure the artificial neurons never die - when they stop working, I think that's called Alzheimer's.

Quote:
<strong>I suppose you would like this structure to be optimized for your particular use.</strong>
That would be too complicated - the 100 billion neurons are each connected to about 10,000 others. I think dreaming does that anyway; also, whenever you repeat something, the neural pathway is strengthened, I think. (So that you can do things a lot faster)
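The pathway-strengthening idea described here resembles Hebbian learning ("neurons that fire together wire together"). A minimal sketch, assuming nothing about real biology - the neuron names and the rate constant are invented for illustration:

```python
# Toy Hebbian strengthening: repeating an action reinforces the weights
# along the pathway that produced it. Names and the learning rate are
# illustrative only, not calibrated neuroscience.

def strengthen(weights, pathway, rate=0.1):
    """Return a new weight table after one repetition of a pathway.

    weights: dict mapping (pre, post) neuron pairs to connection strength
    pathway: sequence of neuron ids that fired in order
    """
    new = dict(weights)
    for pre, post in zip(pathway, pathway[1:]):
        new[(pre, post)] = new.get((pre, post), 0.0) + rate
    return new

w = {}
for _ in range(5):                       # "repeat something" five times
    w = strengthen(w, ["cue", "plan", "move"])

print(round(w[("cue", "plan")], 2))      # pathway is stronger after practice
```

Each repetition nudges only the connections that actually fired, which is why practiced actions get faster while unrelated connections stay untouched.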

Quote:
<strong>Now we just have to figure out what you use your brain for. I gather it has something to do with pleasure. How about if your new brain/body combo is much happier than the old one?</strong>
I might leave pleasure fairly much alone and be able to control the pain from physical injury. If I tampered with my motivational system (pleasure/pain system) too much then I would be like <a href="http://208.245.156.153/archive/output.cfm?ID=91" target="_blank">this</a> man who lost a lot of subtle intuitions that I think are a result of carefully weighing up emotional responses that have been learnt over a lifetime.
I'd also fix my body while I was at it - I'd look like a guy that girls like the look of and hopefully not be very hairy.

Quote:
<strong>Notice, you are referring to an "I" who would be having a different experience from who you are calling "myself". I define a person as an experiencer, so if a being is having a different experience, they are a different person. "I" can't have a different experience from "myself".</strong>
Ok, so there is a copy who has my personality and behavioural patterns. I would be interested in meeting that person.

Quote:
<strong>Is there any problem with someone slipping you some drugs that will first knock you out cold (as far as we can see) and then kill you, assuming there were no chance of this not going smoothly? You apparently wouldn't *know* what happened.</strong>
That would probably be ok, but I'd want to take out life insurance then because if there is no copy then my personality is dead.

Quote:
<strong>Materialists and dualists could have some serious symbiosis, if we just had the technology. I'm so sentimentally attached to certain brain functions of my current brain that I would sell it real cheap with the understanding that it would be installed in a better body and allowed to do what it wants. But my old brain would be your brain, once it was installed in your body, so it would be doing things the "new you" wanted to do, not things "I" wanted.</strong>
No, that brain is still you, except that you've possessed a different body. Initially that brain would have an identical personality to the old you, but over time, its perception of itself because of its body would probably shape that personality so it becomes a different person. But I think technically the same person is involved. e.g. say someone was a strict fundy and then became a promiscuous hippy? In a way they are different people, but technically I think they are the same person. (They basically have the same memories, they've shared a common history, etc)

Quote:
<strong>And guess what? You could make the payment to your own new brain! You just take the money out of your account and put it right back.</strong>
Yes it is my body's new brain, but it is your brain - the one that contains your personality and memories. I would still have a brain then. I don't own that new brain - if I did then I wouldn't need to pay it anything.

Quote:
<strong>And that is just the beginning. You would get to keep all of my: material possessions, money, even any friends who don't particularly mind that my old brain was someone else! Before your old brain complains, you and me could find some way to make sure it didn't know what was going on. hehe</strong>
Well for my old brain to make a withdrawal it would have to be conscious... unless you're saying that you would just use my body to "impersonate" me to get my money. Well your brain doesn't have the memories for my signature or my pin number...

Quote:
<strong>What have you got to loose?</strong>
Well I am that old brain - it depends what you'd do with me. If you threw that brain in the garbage then I lost my life - it is as if I donated my entire body to someone else.
excreationist is offline  
Old 01-19-2002, 11:45 PM   #46
hedonologist
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222

Quote:
Originally posted by God Fearing Atheist:
As to the second, yes, I think you could, and could go about it in the same sort of way I described. Slowly replace existing neurons (or whatever) with a mechanical substitute, and report on the effects.
Say you had that done and then they put all those neurons back together in a body that was also an exact copy of yours. You would have an organic person who feels much like you do now, with your memories, etc, looking at a person with your old body with a computer for a brain. The contract states that one of these "people" must die or be shut off. If you (as you are now) could make this decision, whom would you rather have live?
Quote:
Originally posted by God Fearing Atheist:
Finally, I really don't think this issue has any bearing at all on ethical or behavioral matters. People have no intrinsic value....it is ascribed to them, or sometimes not at all....me, clone me, or me with a clone brain. None of these things affect what we ought not do (I attempted to work this out in my rather long-winded post on contractarianism).
The way we choose to behave is not necessarily based on what someone calls an "intrinsic value", but it *is* based on what we value. I would think that if someone (ie some brain) considered something to be their "self" (ie a physical correlate of their "self"), they would be interested in protecting it, etc. I'm not asking about an "intrinsic value", I'm asking about the value *you* place on various ways of arranging the matter called "your brain", or copies of it, etc.
hedonologist is offline  
Old 01-20-2002, 01:43 AM   #47
hedonologist
Banned
Join Date: Jul 2001
Location: South CA
Posts: 222

I put this next statement on top because I think this is the sort of behavior a "materialist" philosophy can lead to. When I say "materialist" philosophy I'm speaking of the "info matters" type materialist that tronvillain mentioned.
Quote:
Originally posted by excreationist:
quote:
---------------
Hed: Is there any problem with someone slipping you some drugs that will first knock you out cold (as far as we can see) and then kill you, assuming there were no chance of this not going smoothly? You apparently wouldn't *know* what happened.
-----------------------

That would probably be ok, but I'd want to take out life insurance then because if there is no copy then my personality is dead.
Intriguing. I must admit I wasn't expecting you to say that.

How about if someone had some way of getting into your brain and fiddling with your desires so that you felt that you wanted to be their slave? You would have no *knowledge* that they were doing this. For all you know you just enjoyed nothing more than to clean their house, file their bunions, etc.
Quote:
Originally posted by excreationist:
Of course I'd need it. Just about every bit of trivial information probably would be used again one day.
Do you never delete anything from your hard drive?
Quote:
Originally posted by excreationist:
Well mine is a bit unreliable but on the other hand that is probably the cause of a lot of my creativity. So I might leave it the way it is.
Unreliability the cause of creativity? I find this excuse to keep your old brain unsatisfactory. Do you think you are the most creative person who has ever been? We've got Einstein's brain copy here, Leonardo da Vinci, etc, all at bargain rates. We can take creativity from one and combine it with lightning-fast clerical abilities, etc. Certainly you wouldn't rather keep that old thing. Show me a creativity test (or any other type of test) that your current brain can pass, which another brain could not do better.
Quote:
Originally posted by excreationist:
That would be too complicated - the 100 billion neurons are each connected to about 10,000 others. I think dreaming does that anyway; also, whenever you repeat something, the neural pathway is strengthened, I think. (So that you can do things a lot faster)
It is a little too complicated currently, in the real world, but I'm speaking hypothetically to make a point. Say we could map and rebuild your whole brain down to beneath the level of a sub-atomic particle, knowing whatever may cause quantum mechanical fluctuations, etc.
Quote:
Originally posted by excreationist:
I might leave pleasure fairly much alone and be able to control the pain from physical injury. If I tampered with my motivational system (pleasure/pain system) too much then I would be like this man who lost a lot of subtle intuitions that I think are a result of carefully weighing up emotional responses that have been learnt over a lifetime.
Why not just scrap the whole "motivational system" along with the rest of the brain and start over? You are thinking small, like replace a part here and there. I'm saying, "We have no limit of money or technology so why not get a whole new 'computer'?"

So maybe some of the data will be stored in a different format-- the new brain could perform any *objective* function the old one did, and more. Aren't we better off to judge whether or not to upgrade our computer by using some objective sort of benchmark?
Quote:
Originally posted by excreationist:
Ok, so there is a copy who has my personality and behavioural patterns. I would be interested in meeting that person.
So would I. I just wouldn't die for them (if I had a choice) and I wouldn't advocate that someone kill me even if I didn't know I would die.
Quote:
Originally posted by excreationist:
Yes it is my body's new brain, but it is your brain - the one that contains your personality and memories.
Wouldn't they become "your" memories and personality, after having purchased them? What function do the personality and memories serve, in your computer/brain?
Quote:
Originally posted by excreationist:
I would still have a brain then. I don't own that new brain - if I did then I wouldn't need to pay it anything.
Well you would know that you are not paying it anything; your new brain would just think you were paying it. Who cares what it thinks so long as it does what it is supposed to? That is the beauty of it. This stupid dualist brain thinks it is getting paid when in reality, *you* are. So everybody's happy.
Quote:
Originally posted by excreationist:
Well for my old brain to make a withdrawal it would have to be conscious... unless you're saying that you would just use my body to "impersonate" me to get my money. Well your brain doesn't have the memories for my signature or my pin number...
I was assuming that if we did this, your current brain would not become disagreeable unless it thought the new brain would NOT become "you" after it were purchased and installed, to perform the *objective* functions of the old brain. I'm having difficulty figuring out what exactly you are using your computer/brain for. Maybe you could explain what sort of functions you would like to have in a brain and we can set you up with a brain that performs these functions better than any other.
Quote:
Originally posted by excreationist:
Well I am that old brain - it depends what you'd do with me. If you threw that brain in the garbage then I lost my life - it is as if I donated my entire body to someone else.
But if we made a copy of your brain then the brain in the garbage somehow becomes not "you" anymore? The amount you value the brain in the garbage is altered by events happening elsewhere in the universe?

[ January 20, 2002: Message edited by: hedonologist ]
hedonologist is offline  
Old 01-20-2002, 02:33 AM   #48
excreationist
Veteran Member
Join Date: Aug 2000
Location: Australia
Posts: 4,886

Quote:
Originally posted by hedonologist:
<strong>How about if someone had some way of getting into your brain and fiddling with your desires so that you felt that you wanted to be their slave? You would have no *knowledge* that they were doing this. For all you know you just enjoyed nothing more than to clean their house, file their bunions, etc.</strong>
That sounds pretty good actually... at least I could seek my desires in a straightforward way.

Quote:
<strong>Do you never delete anything from your hard drive?</strong>
Neural networks involve single pieces of information being spread throughout many neurons, and a single neuron can help store many pieces of information. So basically it is kind of fuzzy and it's interconnected. I think I read somewhere that we have the potential to store many, many lifetimes of information in our brains. And I try not to delete personal stuff (unless it is incriminating). I only delete games and stuff that I no longer want... but I keep a memory of the name of it so that I can get it again if I like. Inside the brain, memories have quite fast access... I mean once you've accessed them you get a fairly full understanding of the concepts. But if you only just remembered some keywords, you'd have to look up that material again and relearn all the concepts.
I think our large collection of memories is where we get our "common sense" intuitions and things like that from. Without it we'd be very naive and incompetent a lot of the time.
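The "fuzzy, interconnected" storage described above can be illustrated with a toy Hopfield-style associative memory: a pattern is spread across all pairwise connection weights, so recall still works when part of the cue is corrupted. Everything here (sizes, the pattern itself) is an arbitrary example, not a model of any real brain:

```python
# Toy Hopfield-style associative memory: one pattern is stored across
# every pairwise connection, so recall tolerates a partly damaged cue.

def store(patterns, n):
    """Outer-product (Hebbian) weights for +/-1 patterns of length n."""
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, cue, steps=5):
    """Synchronously update the cue until it settles on a stored pattern."""
    s = list(cue)
    for _ in range(steps):
        s = [1 if sum(w[i][j] * s[j] for j in range(len(s))) >= 0 else -1
             for i in range(len(s))]
    return s

n = 8
pattern = [1, -1, 1, 1, -1, -1, 1, -1]
w = store([pattern], n)

noisy = list(pattern)
noisy[0] = -noisy[0]            # corrupt one "neuron"
noisy[3] = -noisy[3]            # and another
print(recall(w, noisy) == pattern)   # → True
```

Because no single unit holds the memory by itself, the system degrades gracefully: flipping a couple of units still recovers the whole pattern, much like recalling a concept from a few keywords.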

Quote:
<strong>Unreliability the cause of creativity? I find this excuse to keep your old brain unsatisfactory. Do you think you are the most creative person who has ever been? We've got Einstein's brain copy here, Leonardo da Vinci, etc, all at bargain rates....</strong>
Well their brains are filled with memories about their life and Einstein refused to believe in Quantum Physics (which had been around since the early 1900's). I think they're much more creative than me at invention or theoretical physics but I think I'm more creative than they are at other things. (pop culture holistic philosophy, etc)

Quote:
<strong>We can take creativity from one...</strong>
I think creativity is the result of a very strong desire for newness, which causes creative people to learn a large variety of patterns that describe the world. These patterns are recombined to satisfy their goals. (e.g. making a flying machine or wondering about the speed of light)
It would help to have a lot of their learnt patterns inserted into my brain so that my "feature set" of possible patterns would be much larger. It might take many nights of dreaming to integrate the new memory fragments.

Quote:
<strong>...and combine it with lightning-fast clerical abilities, etc.</strong>
I think the best way of doing that is increasing the RAM - this would allow me to view a page at once and manipulate it in my mind very fast. It might take a while to get used to that capacity for really wide thought. I'd rather just use calculators to do menial tasks. Maybe I could learn to do calculations at high speed - maybe increased RAM would improve learning... and an increased clock speed would be good too.
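The "more RAM" analogy maps loosely onto a fixed-capacity working-memory buffer. A sketch of why capacity limits what can be manipulated at once - the slot counts and item names are invented for illustration:

```python
from collections import deque

# Toy working-memory buffer: a fixed number of "slots", where attending
# to a new item evicts the oldest. Upgrading "RAM" means raising maxlen.

def attend(buffer, items):
    """Push items through the buffer; deque's maxlen evicts the oldest."""
    for item in items:
        buffer.append(item)
    return list(buffer)

small = deque(maxlen=4)          # a modest working memory
large = deque(maxlen=12)         # the "upgraded" one

page = [f"idea{i}" for i in range(10)]
print(len(attend(small, page)))  # only the last 4 ideas survive
print(len(attend(large, page)))  # the whole page fits at once
```

With the small buffer, only the last few ideas are available together; with the larger one, the whole "page" can be held and manipulated simultaneously, which is the upgrade being described.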

Quote:
<strong>Certainly you wouldn't rather keep that old thing. Show me a creativity test (or any other type of test) that your current brain can pass, which another brain could not do better.</strong>
Creativity is not just about doing well at creativity tests. It is about putting that into practice. But I get your point.

Quote:
<strong>It is a little too complicated currently, in the real world, but I'm speaking hypothetically to make a point. Say we could map and rebuild your whole brain down to beneath the level of a sub-atomic particle, knowing whatever may cause quantum mechanic fluctuations, etc.</strong>
I guess if it had the same personality/memories then it would be "me" but I don't see the point of reorganising the entire brain. If it made things more reliable or allowed much higher clock-speeds or something then I might consider it.

Quote:
<strong>Why not just scrap the whole "motivational system" along with the rest of the brain and start over?</strong>
Well I think all my memories are associated with emotional content from my motivational system. And if you started over, you might give things the wrong priorities and I mightn't act very normally. And I think my motivational system is mostly what makes me "me". It is what keeps me constantly searching for pleasure and avoiding pain in ways that I learnt through experience (and partly through instincts).

Quote:
<strong>So maybe some of the data will be stored in a different format-- the new brain could perform any *objective* function the old one did, and more. Aren't we better off to judge whether or not to upgrade our computer by using some objective sort of benchmark?</strong>
Well some things don't really have "correct" answers, like ethical dilemmas. Or perhaps the correct answer is the one my old brain would give. I think the best way of imitating that is to make the new brain have equivalent information stored in it with equivalent processes.

Quote:
<strong>So would I. I just wouldn't die for them (if I had a choice) and I wouldn't advocate that someone kill me even if I didn't know I would die.</strong>
Well I'd rather be alive than be copied and killed, I think. It depends. If the copy has a perfect body, etc, I might do it and be a martyr, and a copy of me would enjoy the benefits.

About the brain-transferring scenario:

Say you're called "A" and I'm called "B".

Originally brain A is in body A and brain B is in body B. Both brains and bodies are alive. How would you talk me (person B) into letting me put brain A into body B?
excreationist is offline  
Old 01-20-2002, 04:37 AM   #49
tronvillain
Veteran Member
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658

Jamie_L:
Quote:
Suppose you were put in suspended animation. Then an exact copy of you was made, also in suspended animation. The copy is put in a room with green walls. The original "you" is put in a room with red walls. Both copies are awakened at exactly the same time. What color walls will you see when you open your eyes?
It's impossible to say which colour "I" will see unless you specify which "I" you are talking about. In a sense I will see both red and green, though from my perspective I will see only red or green.

[ January 20, 2002: Message edited by: tronvillain ]
tronvillain is offline  
Old 01-20-2002, 05:25 AM   #50
tronvillain
Veteran Member
Join Date: Oct 2000
Location: Alberta, Canada
Posts: 5,658

hedonologist:
Quote:
You made a useful distinction, between "info-matters" and "matter-matters". I'm making the case that "more than" info matters. If info were all that mattered to someone, I would think they could trade in their brain like it were a hard drive, so long as they had the same info and structure in the new brain. How could "you" not survive this, if "you" are only info?
Other than appealing to intuition, I don't see that you've made much of a case for "more than info-matters." As far as I can tell, "I" would survive the process you describe.

Quote:
If we can not be "sustained" as experiencers, but instead who we are at this moment is a different person than who we are the next moment (as one pole of the paradox seems to suggest), then is there any more sense in valuing or trying to sustain "ourselves", if "ourselves" die every moment regardless of what happens physically? So the question becomes, what data in our brains do we want to sustain? Do we want to "survive" as an act of altruism for "our future self", or for some other purpose (other data in the brain)?
We become different people from moment to moment, but this is nothing more than saying that if the person from one moment is held up next to the person from the next moment, then differences will exist. This is obviously true, but it has no apparent effect on the sense of self - "I" continue to exist over time. Why do I care about what happens to my future self? I will become him, just as my past selves have become me.

Quote:
I don't see any danger. It is like having an identical twin, to me, just more similar. When I contemplate the idea, I imagine that we are all just like copies of one person. Maybe I would act the same as anyone else, if I had their genetics and environment. Materialism certainly says so.
Well, you're not me, but I'm not totally sure you've thought through the implications of an exact duplicate. Who gets your life? Can you share it? Will one of you be willing to give some or all of it up?

A statement like "I would act the same as anyone else, if I had their genetics and environment" strips the word "I" of its meaning. Take away genetics and environment and what do you have left? Nothing. So it is difficult to see in what sense "we are all just like copies of one person", unless it is simply that we may have some things in common.
tronvillain is offline  
 

This custom BB emulates vBulletin® Version 3.8.2
Copyright ©2000 - 2015, Jelsoft Enterprises Ltd.