Joined: May 2002
Arn Moderator 4 wrote:
I've deleted a couple of non-substantial posts already. I'm going to pay special attention to this thread. No short quip posts. Substance and respect, please, if you will.
Thus my to-the-point and substantial posting in response to Dembski. If such approaches are considered spamming and worthy of invoking rule 6, then perhaps I should break up the posting into, let's say, six smaller pieces? But that seems to contradict the three-postings-per-day rule. In effect, ARN Moderator 4's suggestion would enforce one large posting instead of multiple smaller ones.
I find it fascinating how you hop from the 'No Free Lunch' theorems as a foundation of your argument to a 'No Free Lunch principle'.
The Design Inference laid the groundwork. This book demonstrates the inadequacy of the Darwinian mechanism to generate specified complexity. Darwinists themselves have made possible such a refutation. By assimilating the Darwinian mechanism to evolutionary algorithms, they have invited a mathematical assessment of the power of the Darwinian mechanism to generate life's diversity. Such an assessment, begun with the No Free Lunch theorems of David Wolpert and William Macready (see section 4.6), will in this book be taken to its logical conclusion. The conclusion is that Darwinian mechanisms of any kind, whether in nature or in silico, are in principle incapable of generating specified complexity. Coupled with the growing evidence in cosmology and biology that nature is chock-full of specified complexity (cf. the fine-tuning of cosmological constants and the irreducible complexity of biochemical systems), this conclusion implies that naturalistic explanations are incomplete and that design constitutes a legitimate and fundamental mode of scientific explanation.
and several other references to the importance of the NFL theorems.
It is interesting to find that one of the fundamental premises of your book has now been sidetracked once it became clear that the NFL theorems are likely not relevant to evolutionary mechanisms. It still saddens me that you accuse people of 'smuggling in' CSI, especially Tom Schneider, who has successfully disproven most of your claims about Ev.
So now the question is reduced to the 'smuggling in' of complex specified information. How is such information 'smuggled in'? Your limited-applicability 'conservation of information' theorem, also known as the second law of thermodynamics, shows that in a closed system information can indeed only decrease. But what about an open system? Information is imparted on the system by making choices, whether by intelligent design or by some vetting algorithm like natural selection, which transforms information about the environment into the genome. The reason this works so well for DNA is that DNA has some very useful properties: it can store historical information, and it can copy and duplicate such information.

Your argument that 'No Free Lunch' theorems/principles show that information/entropy can only increase/decrease through intelligent design is begging the question indeed. What is the difference between intelligent design and natural design, I ask you? In both cases, choices are made that lead to increased correlation between the genome and the environment; hence information is transferred from the environment into the genome.

Furthermore, your argument about front loading becomes meaningless: where is this front loading supposed to have taken place? It must have happened before the laws of physics came into existence, and in any event your approaches do not even allow us to distinguish between front-loaded and interventionist design. As Murray has so aptly argued, this makes intelligent design nothing different from methodological naturalism.
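The point that a blind vetting process can transfer environmental information into a genome can be illustrated with a toy simulation. This is a hypothetical sketch, not Schneider's Ev or anyone else's published code; the "environment" string, alphabet, mutation rate, and population size are all invented for illustration. Selection plus mutation, with no intelligence anywhere, steadily raises the correlation between genome and environment:

```python
import random

# Toy model: a fixed "environment" string and a population of random
# genomes. Each generation, selection keeps the best-matching genome
# and mutation supplies variation among its offspring. The match score
# (genome-environment correlation) increases over time.

random.seed(1)
ENV = "ACGTACGTACGTACGT"   # hypothetical environmental "target"
ALPHABET = "ACGT"

def fitness(genome):
    """Number of positions agreeing with the environment."""
    return sum(g == e for g, e in zip(genome, ENV))

def mutate(genome, rate=0.05):
    """Copy the genome with occasional random substitutions."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else g
                   for g in genome)

# Start from pure noise: random genomes carry ~4 of 16 matches on average.
pop = ["".join(random.choice(ALPHABET) for _ in ENV) for _ in range(50)]
for generation in range(200):
    best = max(pop, key=fitness)          # selection: a choice, not a mind
    pop = [mutate(best) for _ in range(50)]

print(fitness(max(pop, key=fitness)), "of", len(ENV))
```

After 200 generations the best genome matches the environment at nearly every position, far above the random-guessing baseline: the "information" now in the genome is exactly the correlation that selection accumulated.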
So what do we have so far?
1. NFL theorems are not really that important anymore.
2. NFL 'principles' are now the hot topic; of course they lack even more in mathematical foundation, despite Dembski's assertion that "The No Free Lunch Principle states that if you have some naturalistic process whose output exhibits specified complexity, then that process was front-loaded with specified complexity." As argued by Wolpert, one of the authors of the NFL theorems:
I say Dembski "attempts to" turn this trick because despite his invoking the NFL theorems, his arguments are fatally informal and imprecise. Like monographs on any philosophical topic in the first category, Dembski's is written in jello. There simply is not enough that is firm in his text, not sufficient precision of formulation, to allow one to declare unambiguously `right' or `wrong' when reading through the argument. All one can do is squint, furrow one's brows, and then shrug.
3. The 'conservation of information' laws seem to be nothing other than the second law of thermodynamics (SLOT).
Indeed, throughout there is a marked elision of the formal details of the biological processes under consideration. Perhaps the most glaring example of this is that neo-Darwinian evolution of ecosystems does not involve a set of genomes all searching the same, fixed fitness function, the situation considered by the NFL theorems. Rather it is a co-evolutionary process. Roughly speaking, as each genome changes from one generation to the next, it modifies the surfaces that the other genomes are searching. And recent results indicate that NFL results do not hold in co-evolution.
4. Intelligent design cannot distinguish itself from methodological naturalism when it is unable to distinguish between front loading and intervention.
5. Specified complexity seems to be a subjective and meaningless concept, in that one could easily specify any chance/regularity hypothesis, leading to countless false positives. May I point, in this context, to the very much to-the-point analysis by Sobel.
As far as the flagellum is concerned, Ken Miller has posted a prepublication of an article that will appear in a volume entitled "Debating Design: From Darwin to DNA," edited by Michael Ruse and William Dembski, to be published by Cambridge University Press early in 2003. In another prepublication, Miller addresses "Answering the Biochemical Argument from Design".
From this second illustration can be gathered that Dembski's theory enables a moderately imaginative person, with a list of possible delimitations of an event, easily to eliminate relevant chance-hypotheses for the event; if they all make more probable than not its non-occurrence, and avoiding 'false negatives' concerning relevant chance-hypotheses for this event is somewhat (it need not be very) important to him. From the two illustrations, one may gather that by the lights of Dembski's book, we are entitled, and will always be entitled to conclude, that not much happens by chance.
But what I find most telling is that the design inference has now retreated from trying to provide scientific contributions to a mere "Intelligent design, in contrast to Darwinism, is not a theory about process but about creative innovation." It should not come as a surprise that intelligent design researchers have been less than successful in finding funding for research into something that seems unable to contribute much to the scientific discussion.
Not surprisingly, Darwinism, which does propose real scientific pathways that extend our understanding of how life evolved, carries a burden that ID indeed need not deal with: providing hypotheses that can be disproven. Intelligent design, unable even to address whether innovation occurs as an intervention or as some form of front loading, has to deal with the fact that its foundations in NFL theorems, conservation of specified complexity, and specification are falling apart fast. Dembski argues that "The formal theory for specifications that I develop maps onto the biology unproblematically," but I have yet to see anything resembling such a theory. In fact, the 'theory' seems to show that specification depends inherently on subjective interpretation and can in principle be extended to any hypothesis (see also Sobel). The specification of the flagellum shows how meaningless 'specification' really is. It looks like an outboard engine? Well, the inner ear looks like a drum; can we now infer design for the ear? Snowflakes look like little sculptures once magnified enough. The sunset looks like an expressionistic painting. Need I say more?
To state that biology has remained empty-handed in explaining biological complexity seems to show a tendency to ignore the known literature on these topics. But I doubt that Dembski is interested in discussing how ID fares compared to scientific inquiry into these topics; after all, ID has no burden at all. Of course, if Dembski applied his argument consistently, he would have to argue that ID bears the burden of providing convincing evidence of design and its designers, but ID is not interested in process and seems to be stuck detecting rarefied intelligent design using a faulty filter. See for instance the excellent article by Wilkins and Elsberry, "The advantages of theft over toil: the design inference and arguing from ignorance," where they show how Dembski's filter fails to incorporate 'we don't know' and grants a privileged and unwarranted position to the design inference.
That Dembski has abandoned much hope for a scientific breakthrough for ID seems obvious when he argues for political approaches instead. It seems that the 'Wedge strategy' is alive and well. Bruce Gordon, for one, seems to have realized as much:
"... the theory has been prematurely drawn into discussions of public science education where it has no business of appearing without broad recognition from the scientific community ... inclusion of design theory as part of the standard discourse of the scientific community, if it ever happens, will be the result of a long and difficult process of quality research and publication. It also will be the result of overcoming the stigma that has become attached to design research because of the anti-evolutionary diatribes of some of its proponents on the one hand and its appropriation for the purpose of Christian apologetics on the other. ... If design theory is to make a contribution to science, it must be worth pursuing on the basis of its own merits, not as an exercise in Christian 'cultural renewal,' the weight of which it cannot bear. ... In conclusion, it is crucial to note that design theory is at best a supplementary consideration introduced alongside (or perhaps onto by way of modification) neo-Darwinian biology and self-organizational complexity theory. It does not mandate the replacement of these highly fruitful research paradigms, and to suggest that it does is just so much overblown, unwarranted, and ideologically driven rhetoric." Bruce Gordon, ex-CRSC Fellow, Science and Theology 2:1 (2001), p. 9
Perhaps Dembski can help us understand where he believes ID should be going, other than down the inevitable political route, now that the bridges to a scientific route seem to have been burned effectively by Dembski's latest admissions of what ID is and isn't. I had some hope that ID would provide a positive research program that would expand our understanding, but that does not seem to be a burden ID is willing or able to carry.
I am also interested in why, if Dembski believes that the rules of engagement are fixed in favor of evolutionists, he did not invite, for instance, Wesley Elsberry or Richard Wein to present their case at the last RAPID meeting. It seems that ID has only itself to blame here.
[PS: I will be addressing some historical revisionism of Behe's IC ("This becomes immediately evident from reading Behe since in his definition of irreducible complexity, the function of the system in question always stays put.") and many other interesting issues raised by Dembski.]
And some references on evolution and biological complexity:
ev: Evolution of Biological Information
Evolution of Biological Complexity
Genomic Complexity in Micro Organisms and Digital Organisms
Some Techniques for the Measurement of Complexity in Tierra
The Evolution of Complexity and the Value of Variability
Complexity and Self-Organization
The hypothesis that environmental variability promotes the evolution of organism complexity is explored and illustrated in two contexts. A coevolutionary 'Iterated Prisoner's Dilemma' (IPD) ecology, populated by strategies determined by variable-length genotypes, provides a quantitative demonstration, and an example from evolutionary robotics (ER) provides a more qualitative and naturalistic exploration. In the ER example, the above hypothesis is illustrated in real environments, and the organism complexity is seen in robots exhibiting relatively complex behaviours and neural dynamics. Implications are drawn for the emergence of complexity in general, and also for artificial evolution as a design methodology.
What is complexity?
The physical complexity of a sequence refers to the amount of information that is stored in that sequence about a particular environment. For a genome, this environment is the one in which it replicates and in which its host lives, a concept roughly equivalent to what we call a niche.
Information is a statistical form of correlation, and thus requires, mathematically and intuitively, a reference to the system that the information is about.
In this paper, Adami clarifies many of the concepts relevant to complexity, such as information and entropy.
As we saw above, information is revealed, in an ensemble of adapted sequences, as those symbols that are conserved (fixed) under mutational pressure. Imagine then that a beneficial mutation occurs at a variable position. If the selective advantage that it bestows on the organism is sufficient to fix the mutation within the population,(24) the amount of information (and hence the complexity) has increased. A beneficial mutation that is lost before fixation does not decrease the amount of information, nor does this happen if a neutral mutation drifts to fixation.
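Adami's measure can be made concrete with a short sketch. This is my own illustration, not Adami's code; the four-sequence ensemble is invented for demonstration (real estimates require large populations). Per-site information is Hmax minus the observed entropy at that site, with Hmax = log2(4) = 2 bits for a four-letter alphabet; conserved sites contribute the full 2 bits, fully variable sites contribute nothing, and the sum over sites approximates the physical complexity:

```python
import math
from collections import Counter

# An invented ensemble of "adapted" sequences. Sites 0, 1, and 3 are
# fully conserved (fixed under mutational pressure); sites 2 and 4 vary.
ensemble = [
    "AACGT",
    "AACGA",
    "AACGC",
    "AAGGT",
]

def site_entropy(column):
    """Shannon entropy (bits) of the symbols observed at one site."""
    counts = Counter(column)
    n = len(column)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def physical_complexity(seqs):
    """Sum over sites of (Hmax - H_site), with Hmax = log2(4) = 2 bits."""
    return sum(2.0 - site_entropy(col) for col in zip(*seqs))

print(round(physical_complexity(ensemble), 3))  # prints 7.689
```

The three conserved sites contribute 2 bits each; the partly conserved site 2 (three C's, one G) contributes about 1.19 bits; the near-random site 4 contributes 0.5 bits. A beneficial mutation that sweeps to fixation at a variable site would push that site's contribution toward 2 bits, which is exactly the increase in information (and hence complexity) described in the passage above.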