by Nancy Pearcey
At Stanford University in the spring of 2005, I had my first experience of being picketed. Organized by a campus group calling itself Rational Thought, the picketers carried signs protesting the presence of Intelligent Design (ID) proponents on campus. Several local atheist groups joined the controversy, sparking colorful stories in the local newspapers.
Before me at the podium was Michael Behe, author of Darwin’s Black Box, speaking on the scientific evidence against evolution. I followed by explaining the cultural and philosophical implications of evolution. As I spoke, astonishingly, some of the protesters softened their hostility and actually began to engage with what I was saying. The gist of my talk was that Darwinism undercuts the very possibility of rational truth–an argument that seemed unsettling to atheist students who had organized a group specifically to promote rational thought!
To understand how Darwinism undercuts the very concept of rationality, we can think back to the late nineteenth century when the theory first arrived on American shores. Almost immediately, it was welcomed by a group of thinkers who began to work out its implications far beyond science. They realized that Darwinism implies a broader philosophy of naturalism (i.e., that nature is all that exists, and that natural causes are adequate to explain all phenomena). Thus they began applying a naturalistic worldview across the board–in philosophy, psychology, the law, education, and the arts.
At the foundation of these efforts, however, was a naturalistic approach to knowledge itself (epistemology). The logic went like this: If humans are products of Darwinian natural selection, that obviously includes the human brain–which in turn means all our beliefs and values are products of evolutionary forces: Ideas arise in the human brain by chance, just like Darwin’s chance variations in nature; and the ones that stick around to become firm beliefs and convictions are those that give an advantage in the struggle for survival. This view of knowledge came to be called pragmatism (truth is what works) or instrumentalism (ideas are merely tools for survival).
One of the leading pragmatists was John Dewey, who had a greater influence on educational theory in America than anyone else in the 20th century. Dewey rejected the idea that there is a transcendent element in human nature, typically defined in terms of mind or soul or spirit, capable of knowing a transcendent truth or moral order. Instead he treated humans as mere organisms adapting to challenges in the environment. In his educational theory, learning is just another form of adaptation–a kind of mental natural selection. Ideas evolve as tools for survival, no different from the evolution of the lion’s teeth or the eagle’s claws.
In a famous essay called “The Influence of Darwin on Philosophy,” Dewey said Darwinism leads to a “new logic to apply to mind and morals and life.” In this new evolutionary logic, ideas are not judged by a transcendent standard of Truth, but by how they work in getting us what we want. Ideas do not “reflect reality” but only serve human interests.
To appreciate how revolutionary this was, recall that up until this time the dominant theory of knowledge, or epistemology, was based on the biblical doctrine of the image of God. Confidence in the reliability of human knowledge derived from the conviction that finite human reason reflects (to some degree at least) an infinite divine Reason. Since the same God who created the universe also created our minds, we can be confident that our mental capacities reflect the structure of the universe. In The Mind of God and the Works of Man, Edward Craig shows that even as Western thinkers began to move away from orthodox Christian theology, most of them still retained in their philosophy the conception that our minds reflect an Absolute Mind as the basis for trust in human cognition.
The pragmatists were among the first, however, to face squarely the implications of naturalistic evolution. If evolutionary forces produced the mind, they said, then all our beliefs and convictions are nothing but mental survival strategies, to be judged in terms of their practical success in human conduct. William James liked to say that truth is the “cash value” of an idea: If it pays off, then we call it true.
This Darwinian logic continues to shape American thought more than we might imagine. Take religion. William James was raised in a household with an intense interest in religion. (In the Second Great Awakening his father converted to Christianity, then later converted to Swedenborgianism.) As a result, James applied his philosophy of pragmatism to religion: We decide whether or not God exists depending on whether that belief has positive consequences in our experience. “An idea is ‘true’ so long as to believe it is profitable to our lives,” James wrote in What Pragmatism Means. Thus “if theological ideas prove to have a value for concrete life, they will be true.”
Does this sound familiar? A great many Americans today choose their religion based on what meets their needs, or “affirms” them, or helps them cope more effectively with personal issues, from losing weight to building a better marriage. I was recently chatting with a Christian who is very active in her church; but when the topic turned to a mutual friend who is not a believer, her response was, “Well, whatever works for you.” Of course, there is a grave problem with choosing a religion according to “whatever works for you”–namely, that we cannot know whether it is really true or just a projection of our own needs. As Lutheran theologian John Warwick Montgomery puts it, “Truths do not always ‘work’, and beliefs that ‘work’ are by no means always true.”
If James’s religious pragmatism has become virtually the American approach to spirituality today, then Dewey’s pragmatism has become the preferred approach to education. Virtually across the curriculum–from math class to moral education–teachers are trained to be nondirective “facilitators,” presenting students with problems and allowing them to work out their own pragmatic strategies for solving them. Of course, good teachers have always taught students to think for themselves. But today’s nondirective methodologies go far beyond that. They springboard from a Darwinian epistemology that denies the very existence of any objective or transcendent truth.
Take, for example, “constructivism,” a popular trend in education today. Few realize that it is based on the idea that truth is nothing more than a social construction for solving problems. A leading theorist of constructivism, Ernst von Glasersfeld at the University of Georgia, is forthright about its Darwinian roots. “The function of cognition is adaptive in the biological sense,” he writes. “This means that ‘to know’ is not to possess ‘true representations’ of reality, but rather to possess ways and means of acting and thinking that allow one to attain the goals one happens to have chosen.” In short, a Darwinian epistemology implies that ideas are merely tools for meeting human goals.
These results of pragmatism are quite postmodern, so it comes as no surprise to learn that the prominent postmodernist Richard Rorty calls himself a neo-pragmatist. Rorty argues that postmodernism is simply the logical outcome of pragmatism, and he explains why.
According to the traditional, common-sense approach to knowledge, our ideas are true when they represent or correspond to reality. But according to Darwinian epistemology, ideas are nothing but tools that have evolved to help us control and manipulate the environment. As Rorty puts it, our theories “have no more of a representational relation to an intrinsic nature of things than does the anteater’s snout or the bowerbird’s skill at weaving” (Truth and Progress). Thus we evaluate an idea the same way that natural selection preserves the snout or the weaving instinct–not by asking how well it represents objective reality but only how well it works.
I once presented this progression from Darwinism to postmodern pragmatism at a Christian college, and a man in the audience raised his hand: “I have only one question. These guys who think all our ideas and beliefs evolved . . . do they think their own ideas evolved?” The audience broke into delighted applause, because of course he had captured the key fallacy of the Darwinian approach to knowledge. If all ideas are products of evolution, and thus not really true but only useful for survival, then evolution itself is not true either–and why should the rest of us pay any attention to it?
Indeed, the theory undercuts itself. For if evolution is true, then it is not true, but only useful. This kind of internal contradiction is fatal, for a theory that asserts something and denies it at the same time is simply nonsense. In short, naturalistic evolution is self-refuting.
Clash of Worldviews
The media paints the evolution controversy in terms of science versus religion. But it is much more accurate to say it is worldview versus worldview, philosophy versus philosophy. Making this point levels the playing field and opens the door to serious dialogue.
Interestingly, a few evolutionists do acknowledge the point. Michael Ruse made a famous admission at the 1993 symposium of the American Association for the Advancement of Science. “Evolution as a scientific theory makes a commitment to naturalism,” he said–that is, it is a philosophy, not just facts. He went on: “Evolution . . . akin to religion, involves making certain a priori or metaphysical assumptions, which at some level cannot be proven empirically.” Ruse’s colleagues responded with shocked silence, and afterward one of them, Arthur Shapiro, wrote a commentary titled, “Did Michael Ruse Give Away the Store?”
But, ironically, in the process, Shapiro himself conceded that “there is an irreducible core of ideological assumptions underlying science.” He went on: “Darwinism is a philosophical preference, if by that we mean we choose to discuss the material Universe in terms of material processes accessible by material operations.”
It is this worldview dimension that makes the debate over Darwin versus Intelligent Design so important. Every system of thought starts with a creation account that offers an answer to the fundamental question: Where did everything come from? That crucial starting point shapes everything that follows. Today a naturalistic approach to knowledge is being applied to virtually every field. Some say we’re entering an age of “Universal Darwinism,” where it is no longer just a scientific theory but a comprehensive worldview.
It has become a commonplace to say that America is embroiled in a “culture war” over conflicting moral standards. But we must remember that morality is always derivative, stemming from an underlying worldview. The culture war reflects an underlying cognitive war over worldviews–and at the core of each worldview is an account of origins.
Published March 30, 2016