Freethought & Rationalism Archive. The archives are read only.
12-21-2002, 08:10 AM | #1 |
Contributor
Join Date: May 2002
Location: Saint Paul, MN
Posts: 24,524
Information theory.
Having seen the "no new information" argument a dozen or more times, I have a suggestion:
Rather than getting bogged down in debates over whether or not we've seen "new information", perhaps we should start by *defining* information. It seems to me that, if we adhere to a rigorous definition similar to that used in information theory, we will magically make the problem go away, because it's quite clear that mutations "add information" in such a model.
12-21-2002, 08:51 AM | #2 |
Veteran Member
Join Date: Jan 2001
Location: UK
Posts: 1,440
Well, pretty much. There are many different ways to define information, and several ways to define randomness... but I've yet to see any actual mathematical treatment of information by creationists, except Dembski's filter, which I have to say I've never actually seen a direct copy of...
I'd certainly be interested...
12-21-2002, 09:28 AM | #3 |
Contributor
Join Date: May 2002
Location: Saint Paul, MN
Posts: 24,524
I think a good starting point would be the compsci definition: "information is the minimal number of bits needed to express something."
With this definition, it is immediately obvious that most mutations add information.
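To make that concrete, here is a quick sketch (an editorial illustration, not from the thread) that uses zlib-compressed length as a crude proxy for the "minimal number of bits" measure. The DNA strings and the insertion point are made up for the example.

```python
# Crude proxy for "minimal number of bits": the length of the
# zlib-compressed string, in bits. Real Kolmogorov complexity is
# uncomputable; compression only gives an upper bound.
import zlib

def approx_info_bits(s: str) -> int:
    """Approximate information content as compressed size in bits."""
    return len(zlib.compress(s.encode())) * 8

# A repetitive (hence compressible) toy sequence, plus an insertion
# mutation that breaks up the repetition.
original = "ATGGCGTACGATCGATTACGATCGGATCCGTAGCTAGGCT" * 5
mutated = original[:50] + "GTCAAT" + original[50:]  # 6-base insertion

print(approx_info_bits(original))
print(approx_info_bits(mutated))
```

Under this proxy, an insertion that disrupts a repetitive sequence lengthens the compressed output, i.e. it "adds information" in exactly the sense described above.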
12-21-2002, 10:15 AM | #4 |
Veteran Member
Join Date: Mar 2002
Location: anywhere
Posts: 1,976
|
Off-hand, there are several definitions of information that come to mind, such as Shannon's (which I believe is the one seebs is using) or Kolmogorov's. In a previous discussion, I had a link that discussed just how nebulous the concept of information really is -- I'll look for the lost site.
From the many threads that I have read where IDiots and Creationists invoked the concept of information, it is usually associated with an agency (Designer, God, man, etc.). But imo, that is just their way of arbitrarily defining information to require an agency so that they can demonstrate a need for an agency (eh? circular argument?).
The other main point that is usually lost is just how one measures information in a system. The notion is often tossed around without specifying any rigorous and practical manner to quantify it. For instance, anyone who has played with fractals will know that any given fractal image requires non-zero information to capture. But where is the information in the image -- is it in the arrangement of numbers that make up the image itself, or perhaps in the equation that describes it? Anyway, certain information measures are not even computable. Well, I'm not sure where seebs is taking this, but there is my 2 cents.
[ December 21, 2002: Message edited by: Principia ]
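The fractal point can be illustrated with a sketch (an editorial addition, not from the thread): the pixel data of a Mandelbrot image is large, but the rule that generates it, the equation plus resolution and iteration cap, is tiny. A Kolmogorov-style measure would put the "information" closer to the size of the rule. The ASCII rendering and the rule string below are illustrative choices.

```python
# Render a small ASCII Mandelbrot image and compare the size of the
# pixel data with the size of a short description of the generating rule.
def mandelbrot_pixels(width=80, height=40, max_iter=30):
    rows = []
    for j in range(height):
        row = ""
        for i in range(width):
            c = complex(-2.5 + 3.5 * i / width, -1.25 + 2.5 * j / height)
            z = 0j
            n = 0
            while abs(z) <= 2 and n < max_iter:
                z = z * z + c  # the entire "equation" behind the image
                n += 1
            row += "#" if n == max_iter else "."
        rows.append(row)
    return "\n".join(rows)

image = mandelbrot_pixels()
rule_size = len("z -> z*z + c; 80x40; 30 iters")  # generating description
print(len(image), "bytes of pixels vs", rule_size, "bytes of rule")
```

The thousands of bytes of "image" are fully determined by a description a few dozen bytes long, which is why the question "where is the information?" has more than one defensible answer.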
12-21-2002, 10:24 AM | #5 |
Contributor
Join Date: May 2002
Location: Saint Paul, MN
Posts: 24,524
The nice thing is that you can generalize a bit from Shannon's description. I'd say that if we use the equation used to calculate the fractal (along with any information about resolution and iterations), that's a good starting point.
The neat thing is, this is probably the most correct way to approach the question of where "new information" enters the genome; any mutation that we would instinctively think of as "increasing information" probably adds at least one bit of information.
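A minimal sketch of the Shannon view being invoked here (an editorial illustration; the toy sequences are made up): measure the per-symbol entropy of a sequence from its symbol frequencies, and note that a substitution introducing a previously absent symbol raises it.

```python
# Shannon entropy of a sequence, estimated from empirical symbol
# frequencies: H = -sum(p * log2(p)) over the symbols present.
from collections import Counter
from math import log2

def shannon_entropy(seq: str) -> float:
    """Average bits per symbol, from empirical symbol frequencies."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

before = "AAAAAAAATTTT"  # two symbols only
after = "AAAAAAAGTTTT"   # one A substituted by a novel G

print(shannon_entropy(before))
print(shannon_entropy(after))
```

The mutated sequence has strictly higher entropy, so under this measure the point mutation has "added information", which is the claim the thread keeps circling.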
12-21-2002, 10:34 AM | #6 | |
Veteran Member
Join Date: Mar 2002
Location: anywhere
Posts: 1,976
Quote:
[ December 21, 2002: Message edited by: Principia ]
12-21-2002, 10:40 AM | #7 |
Veteran Member
Join Date: Mar 2002
Location: anywhere
Posts: 1,976
|
I found the site: http://www.ils.unc.edu/~losee/ci/node2.html. It's a non-technical essay, but an interesting read.