As reported in Evolution News & Views: in case you had any uncertainty about the upcoming 13-part Cosmos series, a revival of the Carl Sagan franchise, executive producer Seth MacFarlane has Darwin skeptics and alternatives to Darwinian evolution very much in his crosshairs. This is a major and costly project, though Fox won't say how costly - so it's flattering in a way. In an interview in the Los Angeles Times, MacFarlane says:
We've had a resurgence of creationism and intelligent design quote-unquote theory. There's been a real vacuum when it comes to science education. The nice thing about this show is that I think that it does what the original "Cosmos" did and presents it in such a flashy, entertaining way that, as Carl Sagan put it in 1980, even people who have no interest in science will watch just because it's a spectacle. People who watched the original "Cosmos" will sit down and watch with their kids.
How to build an animal
Whereas the focus in Part 1 falls on fossil evidence for an explosion of life in the Early Cambrian, we change gear in Part 2 and examine biological research relevant to the origin of animal phyla.
The starting point is the search for ways of measuring biological information representing different body plans. Shannon's theory of information (when applied to the animal genome) has the merit of mathematical rigour, but Meyer shows that this approach gives insight only into a sequence's capacity to carry information. Whether the sequence is functional is undetermined, so discussion of biological information must extend far beyond quantitative measures. Meyer discusses the number of cell types as an indicator of the complexity of embedded information. With reference to the genome, which uses digital codes, he uses the term "specified information", meaning that a genetic sequence can only be functional if the codons have a specific arrangement. Is the neo-Darwinian mechanism adequate to explain the origin of the novel specified information associated with the Cambrian Explosion? Meyer describes this as a challenging question for Darwinists and claims that the necessity of "vast amounts" of specificity makes their explanations implausible.
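Meyer's distinction can be made concrete in a few lines of code: Shannon's measure can be computed for any sequence at all, but it only quantifies carrying capacity, not function. A minimal sketch in Python (the sequences below are arbitrary illustrations, not real genes):

```python
import math
from collections import Counter

def shannon_bits(seq):
    """Shannon information capacity of a sequence in bits:
    length times entropy per symbol. This measures capacity only;
    it says nothing about whether the sequence is functional."""
    counts = Counter(seq)
    n = len(seq)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return n * entropy

# A sequence and a scrambled copy with identical base composition
# receive identical scores - composition alone fixes the result.
functional = "ATGGCCATTGTAATGGGCCGC"
scrambled = "".join(sorted(functional))
print(round(shannon_bits(functional), 1))
print(round(shannon_bits(scrambled), 1))
```

The scrambled copy scores exactly the same as the original, which is the point Meyer is making: a quantitative Shannon metric cannot distinguish a specified (functional) sequence from noise of the same composition.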
To show that this argument is real, and not an argument from ignorance, Meyer devotes the next chapter to unpacking the issues surrounding specificity. In the early 1960s, Murray Eden (a professor of engineering and computer science at MIT) realised that there was a problem with neo-Darwinian theory and organised a conference to explore the issues at the Wistar Institute in Philadelphia. The theme was: "Mathematical challenges to the neo-Darwinian interpretation of evolution". The participants came from many disciplines and included Ernst Mayr (one of the architects of neo-Darwinism) and Richard Lewontin (Professor of genetics and evolutionary biology). Chairing the meeting was the Nobel laureate Sir Peter Medawar. The discussion provided by Meyer is extremely helpful in clarifying the nature of the problems and summarising some of the suggestions for resolving the dilemmas. The most favoured possible solution is explained in the quotation below, and is significant for stimulating a design-based research programme discussed in the subsequent chapter.
"The solution was this: even though the size of the combinatorial space that mutations needed to search was enormous, the ratio of functional to non-functional base or amino-acid sequence in their relevant combinatorial spaces might turn out to be much higher than Eden and others had assumed. If that ratio turned out to be high enough, then the mutation and selection mechanism would frequently stumble onto novel genes and proteins and could easily leapfrog from one functional protein island to the next, with natural selection discarding the non-functional outcomes and seizing upon the rare (but not too rare) functional sequences." (page 178)
As a research student in the late 1980s, Doug Axe was not persuaded by Dawkins' rhetoric in "The Blind Watchmaker", and wanted to undertake his own research into aspects of genetic information. Reading the proceedings of the Wistar Conference stimulated many ideas for further work. This led Axe to join a protein engineering team at the University of Cambridge. Meyer's discussion of his experiments and results needs to be read in full to appreciate the robustness of the empirical work undertaken. However, this is the conclusion of the first phase of Axe's research:
"Overall, therefore, he showed that despite some allowable variability, proteins (and the genes that produce them) are indeed highly specified relative to their biological functions, especially in their crucial exterior portions. Axe showed that whereas proteins will admit some variation at most sites if the rest of the protein is left unchanged, multiple as opposed to single amino-acid substitutions consistently result in rapid loss of protein function." (p.193)
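The scale of the "combinatorial space" at issue in the Wistar discussion and in Axe's work is easy to reproduce. A back-of-envelope sketch in Python (the 150-residue chain length and the arithmetic are illustrative assumptions, not figures quoted in this review):

```python
import math

# Size of the space of possible amino-acid chains of modest
# protein length: 20 possible residues at each of 150 positions.
residues = 150
log10_space = residues * math.log10(20)
print(log10_space)  # about 195, i.e. 20^150 is roughly 10^195 sequences
```

If only a fraction f of those sequences fold into something functional, a blind search needs on the order of 1/f trials to find one; the dispute sketched in the quotation above is precisely over how large that fraction is.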
In the next chapter, Meyer himself appears as part of the story-line. The year is 2004, when the Proceedings of the Biological Society of Washington carried Meyer's peer-reviewed article that made reference to Axe's work and the Cambrian Explosion dilemma. He argued that "the theory of intelligent design could help explain the origin of biological information" (p.209). In Meyer's own words, the publication of this paper created "a firestorm of controversy". Up to that time, opponents of intelligent design (ID) claimed that until ID made it into peer-reviewed literature, it could not count as science. Once they realised it had passed through, they left no stone unturned in trying to discredit the paper, the journal's editor, and its peer-review process. Many months passed before anything looking like a scientific response appeared, drawing heavily on a 2003 review of thinking about the origin of new genes. Meyer devotes the rest of this chapter to analysing the arguments and showing that the research does not explain the origin of specified information and does not solve the combinatorial inflation problem identified by Murray Eden.
"Overall, what evolutionary biologists have in mind is something like trying to produce a new book by copying the pages of an existing book (gene duplication, lateral gene transfer, and transfer of mobile genetic elements), rearranging blocks of text on each page (exon shuffling, retropositioning, and gene fusion), making random spelling changes to words in each block of text (point mutations), and then randomly rearranging the new pages. Clearly, such random rearrangements and changes will have no realistic chance of generating a literary masterpiece, let alone a coherent read. That is to say, these processes will not likely generate specificity of arrangement and sequence and, therefore, do not solve the combinatorial search problem. In any case, all such scenarios also beg the question. There is a big difference between shuffling and slightly altering pre-existing sequence-specific modules of functional information and explaining how those modules came to possess information-rich sequences in the first place." (p.219)
Neo-Darwinians are remarkably satisfied with natural selection and their hypothetical models of gene evolution, so that platitudes often replace science. Meyer gives an example from an evolutionary textbook: "One need not go into the details of the evolution of the bird's wing, the giraffe's neck, the vertebrate eye, [. . .] Even a slight advantage or disadvantage in a particular genetic change provides a sufficient differential for the operation of natural selection." (quoted on p.234). Anyone who wants to grapple with the details soon meets problems that cast doubt on the adequacy of Darwinian mechanisms. Meyer introduces us to Tom Frazzetta, whose specialism is functional biomechanics. He found great difficulty defending the concept of gradual change because all the intermediate forms he could envisage would not have been viable. The interdependence of biomechanical systems meant that design changes could not be incremental and many would have to occur concurrently. Frazzetta came to the conclusion that "Phenotypic alteration of integrated systems requires an improbable coincidence of genetic (and hence hereditable phenotypic) modifications of a tightly specified kind." (quoted on p.233). This brings us to the work of Michael Behe and David Snoke, and their 2004 paper in Protein Science. They recognised that some inferred evolutionary changes require coordinated mutations, and they used the principles of population genetics to assess the likelihood of such coordinated changes occurring. The calculated probabilities are so low as to cast doubt on this being a widespread phenomenon in the history of life. Behe was to return to this theme later in his book: The Edge of Evolution (2007).
"In a real sense, therefore, the neo-Darwinian math is itself showing that the neo-Darwinian mechanism cannot build complex adaptations - including the new information-rich genes and proteins that would have been necessary to build the Cambrian animals." (p.254)
At this point, the focus of interest shifts from molecules to body plans; from population genetics to developmental biology. Paul Nelson (philosopher of biology) is introduced when commenting on the "great Darwinian paradox". This is the observation that mutations affecting early stage development are not beneficial, yet these are the very mutations needed if there is to be any change in the body plan. In Nelson's words:
"Such early-acting mutations of global effect on animal development, however, are those least likely to be tolerated by the embryo and, in fact, never have been tolerated in any animals that developmental biologists have studied." (p.262).
Early stage development appears to be overseen and coordinated by developmental gene regulatory networks, a concept pioneered by Eric Davidson. It is not a coincidence that developmental biologists like him have been pressing for a new evolutionary synthesis to emerge, because they are acutely aware that neo-Darwinism cannot be the way forward. The tightly integrated gene regulatory networks cannot be mutated incrementally so as to produce new body plans:
"contrary to classical evolution theory, the processes that drive small changes observed as species diverge cannot be taken as models for the evolution of the body plans of animals." (words of Davidson, quoted on p.269).
The challenge to the neo-Darwinian synthesis is even more formidable than this. The mindset of Darwinists is that life is digital. Everything is reduced to bits in the genome sequence. However, what happens to the adequacy of their theory if they are dealing with only part of the information story? What happens if some information is located in the cell independent of the genome? At the very least, if this is true, the textbook orthodoxy can only claim to be a partial account of origins. But it also needs to be considered whether neo-Darwinism is a diversion from the real issues affecting life's diversity. These matters are discussed in Meyer's chapter dealing with the epigenetic revolution.
"Many biologists no longer believe that DNA directs virtually everything happening within the cell. Developmental biologists, in particular, are now discovering more and more ways that crucial information for building body plans is imparted by the form and structure of embryonic cells, including information from both the unfertilized and fertilized egg." (p.275)
Much of this chapter draws on the work of Jonathan Wells, whose analysis of the inadequacy of neo-Darwinian theory incorporates the growing evidence that epigenetic influences on development are substantial. (See also here.)
"Yet both body-plan formation during embryological development and major morphological innovation during the history of life depend upon a specificity of arrangement at a much higher level of the organizational hierarchy, a level that DNA alone does not determine. If DNA isn't wholly responsible for the way an embryo develops - for body-plan morphogenesis - then DNA sequences can mutate indefinitely and still not produce a new body plan, regardless of the amount of time and the number of mutational trials available to the evolutionary process. Genetic mutations are simply the wrong tool for the job at hand." (p.281)
A particularly useful aspect of these chapters is that ID-related research is presented in a way that demonstrates the coherence and value of the design paradigm. Researchers operating within a design framework are addressing issues that are of central importance, publishing their work in peer-reviewed papers and other scholarly forums, and engaging in a constructive discourse with scientists working within the naturalistic evolutionary paradigm. Many will be aware of the work of individual scientists mentioned above, but Meyer's account shows how they contribute to the bigger picture and complement one another. This approach to science is exemplary and one hopes it will inspire young scientists to emulate their endeavours.
Where does this lead us? For the answer to that question, we must turn to Part 3 of Meyer's book.
"[T]he Cambrian explosion now looks less like the minor anomaly that Darwin perceived it to be, and more like a profound enigma, one that exemplifies a fundamental and as yet unsolved problem - the origination of animal form." (p.287)
To be continued.
Darwin's Doubt: The Explosive Origin of Animal Life and the Case for Intelligent Design
by Stephen C. Meyer
HarperOne (HarperCollins), New York, 2013. 520 pp. ISBN 9780062071477.
Readers of Uncommon Descent will recall that mid-20th century Christian apologist C.S. Lewis's views on Darwinism and scientism have attracted considerable interest of late. And some misrepresentation as well, as some zealous followers of Darwin have tried to claim him as one of their own.
For His Substance-Free Contribution to the Debate with Stephen Meyer, American Spectator Readers Pummel John Derbyshire
As reported in ENV...Congratulations to The American Spectator for having such sensible readers. Sometimes it's gratifying to find that the people who should know better actually do.
In January, the conservative magazine featured paired articles by Stephen Meyer and John Derbyshire arguing respectively for and against intelligent design. Derbyshire "argued" only in the limited sense of tossing off snide insults and trying to paint ID absurdly with the brush of "Occasionalism," a medieval theological concept.
Christian Post contributor Anugrah Kumar writes that Casey Luskin, a proponent of Intelligent Design, says that most theistic evolutionists appear to be unfamiliar with what ID theorists say, and they wrongly maintain that it's a "God of the gaps" argument.
The new big debate might be multiverse cosmologist Sean Carroll vs. Christian apologist William Lane Craig (who dismisses multiverses). It is sponsored by the Greer-Heard Point-Counterpoint Forum, February 21-22, with live streaming.
University College London wanted to honor the father of evolutionary theory on his 205th birthday, but they couldn't seem to do it without intelligent design.
The goal of REASONS 2014 will be to demonstrate the beautiful compatibility and synergy of the natural sciences and orthodox Christianity.
Speakers at this Houston area event include...
Stephen C. Meyer
William A. Dembski
John G. West
Bruce L. Gordon
Nancy R. Pearcey
And Houston Baptist University Faculty:
Holly E. Ordway
Melissa Cain Travis
Here's an Interesting and Worthwhile Scientific Volume Advocating, and Challenging, "Intelligible Design"
An endorsement on ENV of the book, which is a collection of philosophical, historical, mathematical, and scientific essays on design in nature. Many of the chapters are written by scientists from outside the United States, with Spain being especially well represented, who are friendly to intelligent design. However, not all of the chapters defend ID. Some of the authors critique ID, or claim it's impossible to scientifically detect design in nature. But even the criticisms are thoughtful, making this volume a worthy addition to anyone's collection of ID-related books.
A South Dakota senator on Thursday killed his proposal that would have allowed teachers to discuss whether a higher power played a role in creation, saying it was too poorly written to pass.
The Senate Education Committee voted to kill the measure, which sought to ensure teachers could tell students about intelligent design, the belief that a higher power must have had a hand in creation because life is too complex to have developed through evolution alone.
The past 10 years have witnessed the rise of New Atheism, particularly in the US and the UK, with leaders who write best-selling books and attract a vociferous following. No doubt the sociologists of science will come up with some interesting things to say about this movement, but it is highly significant that the New Atheists have created deep divisions within their own intellectual community. The latest salvo expressing discontent has been fired by Massimo Pigliucci, evolutionary biologist, philosopher of science and advocate of atheism. In an academic paper, Pigliucci argues that what is "new" about the movement is not the public advocacy of atheism; nor is there novelty in the arguments used to advance its atheistic claims. However, Pigliucci is able to identify two distinctive characteristics of the New Atheists:
"[The first] is to be found in the indisputably popular character of the movement. All books produced by the chief New Atheists [. . .] have been worldwide best sellers, in the case of Dawkins's God Delusion, for instance, remaining for a whopping 51 weeks on the New York Times best-seller list. While previous volumes criticizing religion had received wide popular reception (especially the classic critique of Christianity by Bertrand Russell), nothing like that had happened before in the annals of Western literature. [. . .]
[Secondly, W]hat I see as a clear, and truly novel, though not at all positive, "scientistic" turn that it marks for atheism in general. [. . .] [We will] explore some examples of what I term the "scientistic turn" that has characterized some (but not all) New Atheist writers (and most of their supporters, from what one can glean from the relevant social networks)." (p.144)
New atheists say they trust science, but they redefine science so it cannot lead them to recognise design in nature. (Source here)
It is the second of these characteristics that elicits protestation from Pigliucci: the New Atheists are advancing ideas that call for a firm rebuttal. There is a strong tendency for these new leaders to be rather disparaging of philosophical arguments and to base their polemics on the claims that science is the exclusive route to knowledge and that the findings of science support the atheist position. According to Pigliucci, their approach necessitates a redefining of science, and he argues that the new definition is indistinguishable from scientism.
"The New Atheism approach to criticizing religion relies much more forcefully on science than on philosophy. Indeed, a good number of New Atheists (the notable exception being, of course, Daniel Dennett) is on record explicitly belittling philosophy as a source of knowledge or insight. Dawkins says that the "God hypothesis" should be treated as a falsifiable scientific hypothesis; Stenger explicitly - in the very subtitle of his book - states that "Science shows that God does not exist" (my emphasis); and Harris later on writes a whole book in which he pointedly ignores two and a half millennia of moral philosophy in an attempt to convince his readers that moral questions are best answered by science [. . .]. All of these are, to my way of seeing things, standard examples of scientism. Scientism here is defined as a totalizing attitude that regards science as the ultimate standard and arbiter of all interesting questions; or alternatively that seeks to expand the very definition and scope of science to encompass all aspects of human knowledge and understanding." (p. 144)
The paper discusses a great diversity of issues, but we shall note only comments relating to Richard Dawkins' The God Delusion. Two of the issues identified are: his discussion of morality without gods, and his tackling of "the god hypothesis" to show that science-based evidence allows the hypothesis to be rejected. Regarding morality without gods, Pigliucci is unimpressed. Not only does Dawkins pass over the Greek philosophers, he does not interact with more contemporary secular thinkers who have contributed to this issue. Pigliucci draws attention to the Euthyphro dilemma, which stimulates the question: Is an action moral because the gods decree it, or do the gods decree it because it is moral? Plato and Aristotle came to the view that morality takes precedence over divinity. Atheists make a distinction between religion and morality and argue that morality should not be based on the will of a god.
"When it comes to the issue of why being moral, however, Dawkins shows most clearly his limitations. For instance, he seems to be unaware of what many philosophers consider by far the most powerful argument in favor of the idea that gods and morality are entirely logically independent issues: the so-called Euthyphro dilemma posed by Plato in the homonymous dialogue from 24 centuries ago." (p.150)
What Pigliucci might have noted is that Dawkins promotes the distinction between religion and morality even though he does not use philosophical arguments to justify it. This is why he, and new atheists generally, make frequent references to Old Testament practices, such as slaying enemies and keeping slaves, to show that our sense of morality is such that we would not want to worship the Old Testament God even if he exists. Pigliucci wants to rest his case on the arguments made by secularist moral philosophers; Dawkins wants to rest his case on a scientific assessment of what is morally right; but neither of them engages with the responses of Christian moral philosophers to the Euthyphro dilemma and to the secularists, so resting their case appears somewhat premature. (For a brief response, see here. For some discussion, see here).
Turning to "the god hypothesis", Pigliucci acknowledges that Dawkins uses a scientific approach to "make ideas like a young earth, or the slightly more sophisticated concept of "irreducible complexity" championed by Intelligent Design proponents, clearly untenable". Whilst this smacks of spin to members of those groups, the essential point is that there are aspects of YEC and ID that can be addressed using the tools of science. But is the existence of God amenable to scientific investigation? Can God ever be the subject of a scientific hypothesis? Pigliucci thinks not.
"The real issue is that Dawkins (and most if not all of the New Atheists) does not seem to appreciate the fact that there is no coherent or sensible way in which the idea of god can possibly be considered a "hypothesis" in any sense remotely resembling the scientific sense of the term. The problem is that the supernatural, by its own (human) nature, is simply too swishy to be pinpointed precisely enough." (p.148)
No doubt, this challenge to Dawkins' thinking could be discussed further. It might make the point clearer if it was said that science deals with the behaviour of material things, and God is not material - but the Creator of material things. So science cannot put God under the microscope, nor can experiments be devised to test whether he exists. However, this conflicts with the premise of the new atheists that science is the only route to knowledge. Pigliucci draws together his objections to Dawkins' scientism with these words:
"To recap, then, what is considered to be perhaps the quintessential text of the New Atheism is an odd mishmash of scientific speculation (on the origins of religion), historically badly informed polemic, and rehashing of philosophical arguments. Yet Dawkins and his followers present The God Delusion as a shining example of how science has dealt a fatal blow to the idea of gods." (p.148)
The following quotes summarise the conclusions about scientism. They have already antagonised some of the new atheists and Pigliucci responds to their criticisms here. But more generally, these conclusions are relevant to all who have a stake in the scientific enterprise.
"1. Scientism is philosophically unsound. This is because a scientistic attitude is one of unduly expanding the reach of science into areas where either it does not belong [. . .] or it can only play a supportive role. [. . .] What I do object to is the tendency, found among many New Atheists, to expand the definition of science to pretty much encompassing anything that deals with "facts", loosely conceived. So broadened, the concept of science loses meaning and it becomes indistinguishable from just about any other human activity." (p.151)
"2. Scientism does a disservice to science. Despite representing a strong attempt to expand the intellectual territory, as well as prestige, of science, I think that scientism is detrimental to science in at least two ways: internally to the discipline itself, because it represents a misunderstanding of what science is and how it works, which is unlikely to serve well either practicing scientists or graduate students as scientists-in-training; externally because it has the potential of undermining public understanding and damaging the reputation of science." (p.152)
"3. Scientism does a disservice to atheism. Finally, I maintain that a scientistic turn does not do much good to atheism as a serious philosophical position to begin with, contra the obvious explicit belief of many if not all of the New Atheists." (p.152)
As well as drawing attention to the arguments presented in this paper, there are two issues worth highlighting here. The first has reference to the place of philosophy in developing an informed mind and a mature judgment. It is noticeable how the scientistic rejection of philosophy is gaining ground. In 2010, Stephen Hawking, in The Grand Design, announced that philosophy was "dead" because it had "not kept up with modern developments in science, particularly physics". A year ago, Lewis Wolpert took the side of scientism when discussing Hawking's views (video here). It is to science leaders like these that Pigliucci's essay is directly relevant. He boldly charges them with anti-intellectualism, because their ideology has made them closed-minded about the work of others ploughing in different fields.
"Moreover, it seems clear to me that most of the New Atheists (except for the professional philosophers among them) pontificate about philosophy very likely without having read a single professional paper in that field. If they had, they would have no trouble recognizing philosophy as a distinct (and, I maintain, useful) academic discipline from science: read side by side, science and philosophy papers have precious little to do with each other, in terms not just of style, but of structure, scope, and range of concerns. I would actually go so far as to charge many of the leaders of the New Atheism movement (and, by implication, a good number of their followers) with anti-intellectualism, one mark of which is a lack of respect for the proper significance, value, and methods of another field of intellectual endeavor." (p.152)
The second issue is one where I consider Pigliucci to have underplayed the influence of Darwinism on the rise of atheism. In presenting a historical perspective on atheism, the intellectual leaders are identified as philosophers. This is indicated in the following sentence:
"Even in the twentieth century, that is, before the early twenty-first century advent of New Atheism, the ball was still firmly in the philosophical park when it came to defense of or apologia for atheism: just consider the writings of A. J. Ayer, John Dewey, and, naturally, Bertrand Russell." (p.146)
The impact of Darwin's theory on the acceptance of atheism needs a more thorough discussion than is provided in the paper. Pigliucci does acknowledge that science is not irrelevant to atheism.
"On the contrary, atheism makes increasingly more sense the more science succeeds in explaining the nature of the world in naturalistic terms. After all, Hume's arguments against intelligent design were devastating, but he lacked an alternative explanation for the appearance of design in nature, and it was Darwin that provided it. Indeed, I think the Hume - Darwin joint dispatching of ID is an excellent example of how naturalism - qua philosophical position - is the result of the inextricable link between sound philosophy and good science." (p.152-3)
This quote refers to naturalism rather than atheism, and this is to be commended. Before Darwin, we had the Enlightenment, whereby naturalism replaced Christian Theism as the philosophical stance of scientists. Naturalism led to a wave of religious scepticism - but this was expressed within a Deistic worldview. Most of the Enlightenment scholars were still impressed by the pervasive evidences of design and they reconciled this with naturalism by allowing a creative act at the beginning. However, Darwinism brought immense changes - not to the naturalistic philosophy of science (which was already widespread), but in sweeping away Deism (which was perceived as a god-of-the-gaps blunder). Darwinism then made it possible to be an intellectually-fulfilled atheist, which is what has set Dawkins on his journey to new atheism.
Pigliucci rightly points out that science cannot be extended to cover all aspects of knowledge. However, in calling for a tighter definition of science, he needs to give greater emphasis to the philosophical underpinnings of science. A science that presumes naturalism must necessarily end up as an atheistic science. It fails as science because this approach presumes what it then claims science has confirmed. This means that naturalistic science is not objective and is not able to follow the evidence wherever it leads. For example, this is why the advocates of abiogenesis focus their efforts on chemical evolution, as this is the only avenue that naturalistic science permits researchers to follow. Consequently, the information characteristics of life are underplayed and researchers hope for information to arise by currently unknown emergent processes. The evidence, however, points to complex specified information being fundamental to life, which naturalistic science cannot concede. By contrast, theistic science does not prescribe or predetermine outcomes, but it can handle natural processes as well as recognise intelligent agency. We will make progress when multiple working hypotheses can be tested without prescribing philosophical presuppositions for science. This is where education should be heading, not towards enforcing naturalism as the essence of science.
New Atheism and the Scientistic Turn in the Atheism Movement
Midwest Studies in Philosophy, 37 (1):142-153 (2013)
Abstract: The so-called "New Atheism" is a relatively well-defined, very recent, still unfolding cultural phenomenon with import for public understanding of both science and philosophy. Arguably, the opening salvo of the New Atheists was The End of Faith by Sam Harris, published in 2004, followed in rapid succession by a number of other titles penned by Harris himself, Richard Dawkins, Daniel Dennett, Victor Stenger, and Christopher Hitchens.
Pigliucci, P. On Coyne, Harris, and PZ (with thanks to Dennett), Rationally Speaking (5 February 2014)
In ENV...this point was registered: Whatever you think of the Ham-Nye debate or the presenters, intelligent design was off-topic.
Casey Luskin of the Discovery Institute gives his take on the debate. "Ham talked about some science here and there, but almost all of what he said focused on trying to support a young earth viewpoint. Since he's not a scientist, the great majority of his arguments amounted - over and over again - to "Because the Bible says so." Nye's main argument was, "Because the evidence says so," and he cited a lot of reasonable evidence for an old earth. While Ham did make a few effective points that you don't have to accept evolution to do good science, the compelling scientific evidence for design in nature got skipped over".
Redundancy in the genetic code has long been recognised. Most amino acids can be specified in multiple ways (2-6 synonymous codons). More recently, it has also become apparent that the usage of synonymous codons is non-random, stimulating thought as to why this should be (see here). Since codon usage biases characterise both prokaryotic and eukaryotic genomes, is it possible that they are accidents of evolutionary history? This seems to be ruled out by pervasive evidence of conservation. Since the biases are not removed by mutations, it is inferred that "observed codon preferences in mammalian genomes [. . .] appear to be under selection" (p.1367). Such a conclusion is reached by deduction from evolutionary theory. If the choice among synonymous codons does not matter when manufacturing proteins, could it be relevant to the regulation of genetic processes? Since there is a presumption favouring simplicity in the minds of most geneticists, this research question has only recently been taken up. There are many synonymous codons when coding for proteins, but are they synonymous if they also carry regulatory instructions?
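The redundancy mentioned above can be tabulated directly from the standard genetic code (NCBI translation table 1). A minimal Python sketch, counting how many synonymous codons specify each amino acid:

```python
from collections import Counter
from itertools import product

# Standard genetic code (NCBI translation table 1); codons are generated
# in the conventional order, with bases T, C, A, G at each position.
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODONS = ["".join(c) for c in product("TCAG", repeat=3)]
code = dict(zip(CODONS, AMINO))

# Count synonymous codons per amino acid ('*' marks the stop signals).
degeneracy = Counter(code.values())
for aa, n in sorted(degeneracy.items()):
    print(aa, n)
```

The counts range from 1 (methionine, tryptophan) to 6 (leucine, serine, arginine), which is why "most" amino acids have 2-6 synonymous spellings.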
"Genomes also contain a parallel regulatory code specifying recognition sequences for transcription factors (TFs), and the genetic and regulatory codes have been assumed to operate independently of one another and to be segregated physically into the coding and noncoding genomic compartments. However, the potential for some coding exons to accommodate transcriptional enhancers or splicing signals has long been recognized." (p.1367)
The challenge of the Human Genome Project has given way to searching for an understanding of multiple overlapping genetic codes. (source here)
With the availability of large amounts of genome data, it is possible to test many hypotheses relevant to the functionality of DNA sequences. The data set used is impressive:
"To define intersections between the regulatory and genetic codes, we generated nucleotide-resolution maps of TF occupancy in 81 diverse human cell types using genomic deoxyribonuclease I (DNaseI) footprinting. Collectively, we defined 11,598,043 distinct 6- to 40-base pair (bp) footprints genome-wide (~1,018,514 per cell type), 216,304 of which localized completely within protein-coding exons (~24,842 per cell type). Approximately 14% of all human coding bases contact a TF in at least one cell type (average 1.1% per cell type), and 86.9% of genes contained coding TF footprints (average 33% per cell type)." (p.1367)
A summary of the main findings of the research team is provided in a Perspectives essay by Weatheritt and Babu. The hypothesis of two co-existing codes is fully justified by the evidence. According to the press release: "scientists were stunned to discover that genomes use the genetic code to write two separate languages."
"How widespread is the phenomenon of "regulatory" codes that overlap the genetic code, and how do they constrain the evolution of protein sequences? Stergachis et al. address these questions for the transcription factor-binding regulatory code. They use deoxyribonuclease I (DNase I) footprinting to map transcription factor occupancy (a protein bound to DNA can protect that region from enzymatic cleavage) at nucleotide resolution across the human genome in 81 diverse cell types. The authors determined that ~14% of the codons within 86.9% of human genes are occupied by transcription factors. Such regions, called "duons", therefore encode two types of information: one that is interpreted by the genetic code to make proteins and the other, by the transcription factor-binding regulatory code to influence gene expression. This requirement for transcription factors to bind within protein-coding regions of the genome has led to a considerable bias in codon usage and choice of amino acids, in a manner that is constrained by the binding motif of each transcription factor." (p.1325)
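The dual-use idea can be illustrated with a toy example: the same dipeptide can be spelled with synonymous codons that either do or do not contain a TF-binding motif. The motif "GACGT" below is invented purely for this sketch; real TF motifs are longer and degenerate.

```python
from itertools import product

# Toy "duon" illustration: the dipeptide Asp-Val can be encoded in
# 2 x 4 = 8 ways, but only the GAC spellings of aspartate create the
# hypothetical TF-binding motif "GACGT" (invented for this example).
SYN = {"D": ["GAT", "GAC"],                # aspartate
       "V": ["GTT", "GTC", "GTA", "GTG"]}  # valine
MOTIF = "GACGT"

hits = []
for d, v in product(SYN["D"], SYN["V"]):
    seq = d + v
    if MOTIF in seq:
        hits.append(seq)
print(hits)  # only the GAC... spellings carry the motif
```

If a TF needs to bind at this position, the "silent" choice between GAT and GAC is no longer free, which is the constraint on codon usage that Stergachis et al. report.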
Weatheritt and Babu go further. They suggest a general principle: that redundancy in the genetic code opens the door for, not one, but many regulatory codes that can operate within protein-coding regions of the genome. One research question of the future is: how many overlapping codes can be tolerated by the genetic code?
"This "binding" code joins other "regulatory" codes that govern chromatin organization, enhancers, mRNA structure, mRNA splicing, microRNA target sites, translational efficiency, and cotranslational folding, all of which have been proposed to constrain codon choice, and thus protein evolution." (p.1325)
It should be noted that these research findings do not tell us what binding a transcription factor actually achieves. The field of gene regulation is in its infancy. The research team notes that TF binding "may serve multiple functional roles" but that their analysis is "agnostic" to this functionality. Weatheritt and Babu conclude:
"The investigation of overlapping codes opens new vistas on the functional interpretation of variation in coding regions and makes it clear that the story of the genetic code has not yet run its course." (p.1326)
This discussion of genetic codes is only meaningful if it is recognised that the genome is a carrier of complex specified information. The essence of life is not to be found in chemistry, but in the information carried within the cell. Chemicals are used to carry biological information, but the chemicals are not themselves information. The research team recognises this when they say:
"Our results indicate that simultaneous encoding of amino acid and regulatory information within exons is a major functional feature of complex genomes. The information architecture of the received genetic code is optimized for superimposition of additional information and this intrinsic flexibility has been extensively exploited by natural selection." (p.1371-2)
There is a problem with the last few words of the above quotation. The flexible information architecture is said to be exploited "by natural selection", yet this claim has not emerged from a study of the evidence. Rather, the theoretical framework of neo-Darwinism provides the context for interpreting the evidence, so that all signs of complexity and functionality are automatically attributed to the operation of natural selection. Yet we have no evidence to show that natural selection can either produce or refine complex specified biological information.
There is a perfectly viable alternative hypothesis to consider: that biological information is evidence for intelligent agency. The evidence we already have about the genetic code is sufficient to make the point, but new evidence of overlapping codes adds weight to the hypothesis. The genetic code, with its redundancy, overlaps with other regulatory codes in ways that test the ability of molecular biologists (intelligent agents) to understand what's happening, let alone write overlapping codes of their own as a biomimetic exercise. From time to time, leading biologists get the message, but seem at a loss to drive it forward.
"Any living being possesses an enormous amount of "intelligence", very much more than is necessary to build the most magnificent of cathedrals. Today, this "intelligence" is called "information", but it is still the same thing. It is not programmed as in a computer, but rather it is condensed on a molecular scale in the chromosomal DNA or in that of any other organelle in each cell. This "intelligence" is the sine qua non of life. If absent, no living being is imaginable. Where does it come from? This is a problem which concerns both biologists and philosophers and, at present, science seems incapable of solving it." Pierre-Paul Grassé, Evolution of Living Organisms: Evidence for a New Theory of Transformation (New York: Academic Press, 1977), 2.
The decision to endorse a naturalistic explanation, rather than advance agnosticism about the origins of these hidden overlapping codes, is a pointer to hidden ideologies in origins-science. It seems that as long as materialism/naturalism is presumed, a great number of unwarranted assertions (usually linked to Darwinism or abiogenesis) go unchallenged in academic papers. As soon as it is pointed out that only intelligent agents write codes, there is an outcry that science is being subverted by religious fundamentalists. However, the converse is true: intelligent design theory is based on the evidence of complex specified information. The claimed evidence for naturalistic alternatives evaporates under close scrutiny.
Exonic Transcription Factor Binding Directs Codon Choice and Affects Protein Evolution
Andrew B. Stergachis, Eric Haugen, Anthony Shafer, Wenqing Fu, Benjamin Vernot, Alex Reynolds, Anthony Raubitschek, Steven Ziegler, Emily M. LeProust, Joshua M. Akey and John A. Stamatoyannopoulos.
Science, 13 December 2013, 342, 1367-1372 | DOI:10.1126/science.1243490 [pdf here]
Abstract: Genomes contain both a genetic code specifying amino acids and a regulatory code specifying transcription factor (TF) recognition sequences. We used genomic deoxyribonuclease I footprinting to map nucleotide resolution TF occupancy across the human exome in 81 diverse cell types. We found that ~15% of human codons are dual-use codons ("duons") that simultaneously specify both amino acids and TF recognition sites. Duons are highly conserved and have shaped protein evolution, and TF-imposed constraint appears to be a major driver of codon usage bias. Conversely, the regulatory code has been selectively depleted of TFs that recognize stop codons. More than 17% of single-nucleotide variants within duons directly alter TF binding. Pervasive dual encoding of amino acid and regulatory information appears to be a fundamental feature of genome evolution.
Weatheritt, R.J. and Babu, M.M. The Hidden Codes That Shape Protein Evolution,
Science, 13 December 2013, 342, 1325-1326 | DOI: 10.1126/science.1248425
Klinghoffer, D. Genome Uses Two Languages Simultaneously; Try That Yourself Sometime, Why Don't You, Evolution News & Views (December 13, 2013)
Luskin, C. Codes Within Codes: How Dual-Use Codons Challenge Statistical Methods for Inferring Natural Selection, Evolution News & Views (December 20, 2013)
This Discovery Institute seminar will prepare students to make research contributions advancing the growing science of intelligent design (ID). The seminar will explore cutting-edge ID work in fields such as molecular biology, biochemistry, embryology, developmental biology, paleontology, computational biology, ID-theoretic mathematics, cosmology, physics, and the history and philosophy of science. This seminar is open to students who intend to pursue graduate studies in the natural sciences or the philosophy of science. Applicants must be college juniors or seniors or already in graduate school.
Quantum phenomena in biology are receiving the attention of more and more researchers, with photosynthesis being the process getting the most attention. Back in 2007, it was already apparent that quantum effects could help in "explaining the extreme efficiency of photosynthesis". Then, in 2010, the photosynthetic apparatus of cryptophyte algae was the focus of research, because its pigments are farther apart than was expected for efficient functioning. In a News & Views article in Nature, van Grondelle & Novoderezhkin discussed evidence suggesting that a process known as quantum coherence is part of the explanation. They added: "This is the first time that this phenomenon has been observed in photosynthetic proteins at room temperature, rather than at much lower temperatures, bolstering the idea that quantum coherence influences light harvesting in vivo." The most recent study has provided a theoretical argument that quantum effects must be present and that classical physics does not provide an explanation. It is claimed to be "the first unambiguous theoretical evidence of quantum effects in photosynthesis". The press release describes the work in this way:
"Often, to observe or exploit quantum mechanical phenomena, systems need to be cooled to very low temperatures. This however does not seem to be the case in some biological systems, which display quantum properties even at ambient temperatures. Now, a team at UCL have attempted to identify features in these biological systems which can only be predicted by quantum physics, and for which no classical analogues exist. "Energy transfer in light-harvesting macromolecules is assisted by specific vibrational motions of the chromophores," said Alexandra Olaya-Castro (UCL Physics & Astronomy), supervisor and co-author of the research. "We found that the properties of some of the chromophore vibrations that assist energy transfer during photosynthesis can never be described with classical laws, and moreover, this non-classical behaviour enhances the efficiency of the energy transfer."" (Source here)
New frontiers for understanding the natural world (image source here)
Most light-gathering macromolecules are composed of chromophores (the light-absorbing pigments) attached to proteins. These are responsible for the first step of photosynthesis, which is to capture light and transfer its energy to another system that can store it. Earlier work showed that energy is transferred in a wave-like manner (the quantum coherence model). However, theoreticians were of the opinion that classical physics could still find a way of explaining the observations.
"Molecular vibrations are periodic motions of the atoms in a molecule, like the motion of a mass attached to a spring. When the energy of a collective vibration of two chromophores matches the energy difference between the electronic transitions of these chromophores a resonance occurs and efficient energy exchange between electronic and vibrational degrees of freedom takes place. Providing that the energy associated to the vibration is higher than the temperature scale, only a discrete unit or quantum of energy is exchanged. Consequently, as energy is transferred from one chromophore to the other, the collective vibration displays properties that have no classical counterpart. The UCL team found the unambiguous signature of non-classicality is given by a negative joint probability of finding the chromophores with certain relative positions and momenta. In classical physics, probability distributions are always positive." (Source here)
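The criterion in the quotation - that the vibrational quantum must exceed the thermal energy scale - is easy to check numerically. A back-of-envelope Python sketch, taking 1000 cm-1 as an assumed, typical chromophore vibrational frequency (not a figure from the paper):

```python
# Compare the energy of one vibrational quantum, h*c*wavenumber, with
# the thermal energy k_B*T at room temperature. A ratio well above 1
# means only discrete quanta are exchanged, as the press release states.
h  = 6.62607015e-34   # Planck constant, J s
c  = 2.99792458e10    # speed of light, cm/s
kB = 1.380649e-23     # Boltzmann constant, J/K

wavenumber = 1000.0   # chromophore vibrational mode, cm^-1 (assumed)
T = 300.0             # room temperature, K

E_vib   = h * c * wavenumber  # energy of one vibrational quantum, J
E_therm = kB * T              # thermal energy scale, J
print(E_vib / E_therm)        # ~4.8: well into the quantum regime
```

At 300 K the thermal energy corresponds to roughly 208 cm-1, so vibrations in the hundreds-to-thousands of wavenumbers range satisfy the criterion even at ambient temperature.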
Bear in mind that considerable resources have already been spent on trying to develop a biomimetic system that captures solar energy like plants do - only to find that photosynthesis is extraordinarily complex, and the research has not yet delivered any commercial outputs. It is a reminder that the Darwinian vision of ultimate simplicity has been repeatedly falsified. Photosynthesising microorganisms are among the earliest to appear in the Precambrian fossil record - and yet these organisms have chemical and physical pathways that are only beginning to be understood within the research community. What is emerging is a picture of processes and structures that carry the hallmarks of design, with complex specified information at every level of analysis. We are at the beginning of a journey into quantum effects in biology. It is the design paradigm that is best equipped to guide our thoughts and keep us on the right path.
"Other biomolecular processes such as the transfer of electrons within macromolecules (like in reaction centres in photosynthetic systems), the structural change of a chromophore upon absorption of photons (like in vision processes) or the recognition of a molecule by another (as in olfaction processes), are influenced by specific vibrational motions. The results of this research therefore suggest that a closer examination of the vibrational dynamics involved in these processes could provide other biological prototypes exploiting truly non-classical phenomena." (Source here)
Non-classicality of the molecular vibrations assisting exciton energy transfer at room temperature
Edward J. O'Reilly & Alexandra Olaya-Castro
Nature Communications, 9 January 2014, 5, Article number:3012 | doi:10.1038/ncomms4012
Abstract: Advancing the debate on quantum effects in light-initiated reactions in biology requires clear identification of non-classical features that these processes can exhibit and utilize. Here we show that in prototype dimers present in a variety of photosynthetic antennae, efficient vibration-assisted energy transfer in the sub-picosecond timescale and at room temperature can manifest and benefit from non-classical fluctuations of collective pigment motions. Non-classicality of initially thermalized vibrations is induced via coherent exciton-vibration interactions and is unambiguously indicated by negativities in the phase-space quasi-probability distribution of the effective collective mode coupled to the electronic dynamics. These quantum effects can be prompted upon incoherent input of excitation. Our results therefore suggest that investigation of the non-classical properties of vibrational motions assisting excitation and charge transport, photoreception and chemical sensing processes could be a touchstone for revealing a role for non-trivial quantum phenomena in biology.
Cartwright, J. Quantized vibrations are essential to photosynthesis, say physicists, physicsworld.com (22 January 2014)
Tyler, D. Explaining the extreme efficiency of photosynthesis. ARN Literature Blog (16 April 2007)
Tyler, D. The latest thinking on how photosynthesis evolved. ARN Literature Blog (11 February 2007)
All ten right HERE.
A great article by Tom Bethell in ENV...
From Nature Magazine...If our planet were just a little closer to the Sun, a runaway greenhouse effect would render it unliveable, a climate model suggests. The simulation, which helps to define the inner edge of a star system's 'habitable zone', drastically reduces the fraction of Sun-like stars that might harbour a rocky planet suitable for life, according to some scientists. But others note that the model, although detailed, might be too restrictive because it applies only to Earth-like planets on which water is abundant.
Of course we are also asked to believe that the right planet, just at the right distance from its star, will naturally generate life from non-living chemicals...
What happened here is that a group of adults can't have a class taught by a qualified person on a topic that interests them in a suitable public venue, because an individual is allowed to shut it down just by scaring people by making a scene.