Freethought & Rationalism Archive. The archives are read-only.
02-24-2003, 03:51 PM | #1 |
Regular Member
Join Date: Feb 2003
Location: Amman, Jordan
Posts: 258
No Free Lunch
William Dembski and Michael Behe are the two leading figures in the so-called Intelligent Design (ID) movement. They claim that neo-Darwinism cannot explain the origin and subsequent evolution of life, and that these can only be explained by ID.
Dembski's latest book, "No Free Lunch," argues that the information in Nature cannot arise de novo: for information to exist there has to be a designer; it does not simply emerge. He also draws on Behe's concept of irreducible complexity for certain structures in biological organisms, arguing that they could not have arisen through multiple steps acted upon by natural selection. Personally, I find the mathematical arguments elegant but fundamentally flawed. Dembski and Behe define "information" according to Shannon's information theory, yet use it in a way that already presupposes design, which makes it a self-reinforcing argument. The basic assumptions they make have nothing to support them. Would anyone like to discuss the No Free Lunch hypothesis?
02-24-2003, 04:21 PM | #2 |
Veteran Member
Join Date: Jul 2002
Location: East Coast. Australia.
Posts: 5,455
|
The no free lunch hypothesis is a thought experiment that contradicts the well-known results of real-life experiments and naturally occurring examples. This is one of the calling cards of a quack scientist.
02-24-2003, 04:32 PM | #3 | |
Veteran Member
Join Date: Mar 2002
Location: anywhere
Posts: 1,976
|
Quote:
But in general you're right. Dembski's thesis is an argument from ignorance. You've read this Talk.Origins FAQ by Richard Wein, right? EDIT: fixed link
02-24-2003, 06:05 PM | #4 | |
Regular Member
Join Date: Feb 2003
Location: Amman, Jordan
Posts: 258
|
Quote:
The way Dembski and Behe use this definition to invoke design is by claiming that some agent must perform the reduction of uncertainty. That was at least my understanding. Thanks a lot for the link you sent me, this is exactly what I was looking for.
02-24-2003, 06:41 PM | #5 |
Veteran Member
Join Date: Mar 2002
Location: anywhere
Posts: 1,976
|
MyKell,
Shannon's information metric is typically defined as: Code:
H = - sum_{i=1}^{N} p_i * log2(p_i)
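For concreteness, here is a minimal sketch of that formula in Python (my own illustration, not from the thread; it uses log base 2, so H comes out in bits, and the function name is mine):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p_i * log2(p_i)); zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit per flip.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin carries much less uncertainty.
print(shannon_entropy([0.9, 0.1]))   # about 0.47
```

Note that a certain outcome (p = 1) yields H = 0: no uncertainty, and in Shannon's sense no information gained by observing it.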
02-24-2003, 06:53 PM | #6 |
Regular Member
Join Date: Feb 2003
Location: Amman, Jordan
Posts: 258
|
Thanks for the note, that cleared up some of the confusion.
I think I was confused about how information content relates to complexity. Obviously, they are not the same thing. The chaotic pattern of snow on a TV screen has maximum information content but, intuitively, very little complexity, right?
02-24-2003, 07:04 PM | #7 |
Veteran Member
Join Date: Mar 2002
Location: anywhere
Posts: 1,976
|
This is a difficult question, since the quantification of complexity is a whole field of research unto itself. Complexity in certain senses (e.g. Kolmogorov-Chaitin) measures the "size" of the minimal process required to generate a given output, but these measures are notoriously difficult to compute. I'd suggest doing a Google search on the topic, or perhaps one of the more versed lurkers will pop in (RBH or lpetrich, you there?).
The main twist Dumbski offers to this field is the notion of "specified" complexity. And the problem he is trying to circumvent, imo, is the common [though mistaken] conflation of information measures and semantic content. So, according to him, design is real if something is complex (which to him merely means that it is improbable to occur by chance and natural law) and if it is specified (i.e. that it has "meaning" in some subjective sense). You're probably better versed in Dumbskian than I am at this point, since you recently finished reading NFL. So if I got Dumbski wrong, I don't care.
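As a rough illustration of the Kolmogorov-Chaitin idea (my own sketch, not from the thread): the true measure is uncomputable, but the compressed size of a string gives a crude upper-bound proxy for it. Notice that by this measure random "TV snow" comes out highly complex, which is one reason the different senses of "complexity" get tangled:

```python
# Crude, illustrative proxy for Kolmogorov-style complexity:
# the zlib-compressed size of a string (an upper bound only;
# true Kolmogorov complexity is uncomputable).
import random
import zlib

random.seed(0)

noise = bytes(random.randrange(256) for _ in range(10_000))  # "TV snow"
pattern = b"AB" * 5_000                                      # highly regular

# Random noise is essentially incompressible...
print(len(zlib.compress(noise)))    # close to 10,000 bytes
# ...while a regular pattern compresses to almost nothing.
print(len(zlib.compress(pattern)))  # a few dozen bytes
```

The short compressed form of the pattern is, in effect, the "minimal process" that regenerates it, whereas no program much shorter than the noise itself can reproduce the noise.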
02-24-2003, 09:13 PM | #8 |
Contributor
Join Date: Aug 2002
Location: Ohio
Posts: 15,407
|
Complexity Measures
IIRC, in The Design Inference Dembski started out using "specified improbability," but shifted to "specified complexity" with the same meaning as improbability. I wish he had stuck with the improbability locution, but "improbable" ain't as sexy as "complex," and he couldn't (unjustifiably) piggyback on the connotations of "complex" if he used "improbable."
For some background on complexity measures, see the links suggested here. They're down close to the bottom of the page. RBH
02-25-2003, 05:34 AM | #9 |
Contributor
Join Date: Feb 2002
Location: With 10,000 lakes who needs a coast?
Posts: 10,762
|
Kent Hovind could testify that not only can you get a free lunch, but you don't have to pay taxes on it.
02-25-2003, 08:48 AM | #10 | |
Veteran Member
Join Date: Jan 2001
Location: Santa Fe, NM
Posts: 2,362
|
Quote:
Dumbski plays fast and loose with the interpretation of the theorems (for example, adding on the assertion that they somehow make evolutionary algorithms infeasible), but that's not the theorems' fault.
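For what it's worth, here is a minimal sketch (my own construction, not Dembski's or Dawkins's exact setup) of cumulative selection on a structured fitness landscape, in the spirit of the weasel-style examples the NFL theorems get invoked against. Blind sampling of 28-character strings would need on the order of 27^28 tries; mutation plus selection finds the target quickly, precisely because this landscape is structured rather than an average over all possible landscapes:

```python
# Sketch: a simple (1+100) evolutionary search on a structured
# fitness landscape. The NFL theorems only equalize optimizers when
# performance is averaged over ALL possible landscapes, most of
# which are unstructured; they say nothing about landscapes like this one.
import random

random.seed(1)

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(s):
    """Number of positions matching the target string."""
    return sum(a == b for a, b in zip(s, TARGET))

def mutate(s, rate=0.05):
    """Independently replace each character with probability `rate`."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in s)

# Start from a random string and keep the best of parent + 100 offspring.
parent = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
for generation in range(10_000):
    children = [mutate(parent) for _ in range(100)]
    parent = max(children + [parent], key=fitness)
    if parent == TARGET:
        break

print(generation, parent)
```

Keeping the parent in the selection pool (elitism) means fitness never decreases, so the search reaches the target in a few hundred generations at most, versus the astronomically improbable single-shot draw.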