The origin of life

Darwin’s theory of evolution by natural selection deals with the question of how life evolves and does not directly address the question of the origin of life itself. The fields of cosmology, physics, and chemistry have provided models of how the universe evolved and how, among other things, the solar system formed. But those theories do not explain how organic molecules, the basic building blocks of life, came about.

An article by Gareth Cook in the August 14, 2005 issue of the Boston Globe examined this question in the light of an initiative (known as the “Origins of Life in the Universe Initiative”) by then Harvard president Lawrence Summers to invest millions to investigate this important question, partly in an effort to have Harvard try to catch up with the leaders in this field at the University of Arizona, the California Institute of Technology, and the Scripps Research Institute in La Jolla, Calif.

What the neuroscience community thinks about the mind/brain relationship

The idea that the mind is purely a product of the material in the brain has profound consequences for religious beliefs, which depend on the idea of the mind as an independent controlling force. The very concept of ‘faith’ implies an act of free will. So the person who believes in a god is pretty much forced to reject the idea that the mind is purely a creation of the brain. As the article Religion on the Brain (subscription required) in the May 26, 2006 issue of the Chronicle of Higher Education (Volume 52, Issue 38, Page A14) says:

Pope John Paul II struck a similar theme in a 1996 address focusing on science, in which he said theories of evolution that “consider the mind as emerging from the forces of living matter, or as a mere epiphenomenon of this matter, are incompatible with the truth about man. Nor are they able to ground the dignity of the person.”

As I wrote yesterday, the flagging intelligent design creationism (IDC) movement seems to be hoping for some fresh energy to emerge from the work of psychiatric researcher Dr. Schwartz. Or at the very least they may be hoping that they can persuade the public that the mind does exist independently of the brain. But they are going to have a hard time getting traction for this idea within the neurobiology community. There seems to be a greater degree of unanimity among them about the material basis of the mind than there is among biologists about the sufficiency of natural selection.

Stephen F. Heinemann, president of the Society for Neuroscience and a professor in the molecular-neurobiology lab at the Salk Institute for Biological Studies, in La Jolla, Calif., echoed many scientists’ reactions when he said in an e-mail message, “I think the concept of the mind outside the brain is absurd.”

But the ability of the neurobiology community to do their work unfettered by religious scrutiny may be coming to an end as increasing numbers of people become aware of the consequences of accepting the idea that the mind is purely a product of the brain. People might reject this idea (and be attracted to the work of Dr. Schwartz), not because they have examined and rejected the scientific evidence in support of it, but because it threatens their religious views. As I discussed in an earlier posting, people who want to preserve a belief system will accept almost any evidence, however slender or dubious, if it seems to provide them with an option of retaining it. As the article says:

Though Dr. Schwartz’s theory has not won over many scientists, some neurobiologists worry that this kind of argument might resonate with the general public, for whom the concept of a soul, free will, and God seems to require something beyond the physical brain. “The truly radical and still maturing view in the neuroscience community that the mind is entirely the product of the brain presents the ultimate challenge to nearly all religions,” wrote Kenneth S. Kosik, a professor of neuroscience research at the University of California at Santa Barbara, in a letter to the journal Nature in January.
. . .
Dr. Kosik argues that the topic of the mind has the potential to cause much more conflict between scientists and the general public than does the issue of evolution. Many people of faith can easily accept the tenets of Darwinian evolution, but it is much harder for them to swallow the assumption of a mind that arises solely from the brain, he says. That issue he calls a “potential eruption.”

When researchers study the nature of consciousness, they find nothing that persuades them that the mind is anything but a product of the brain.

The reigning paradigm among researchers reduces every mental experience to the level of cross talk between neurons in our brains. From the perspective of mainstream science, the electrical and chemical communication among nerve cells gives rise to every thought, whether we are savoring a cup of coffee or contemplating the ineffable.
. . .
Mr. [Christof] Koch [a professor of cognitive and behavioral biology at the California Institute of Technology] collaborated for nearly two decades with the late Francis Crick, the co-discoverer of DNA’s structure, to produce a framework for understanding consciousness. The key, he says, is to look for the neural correlates of consciousness – the specific patterns of brain activity that correspond to particular conscious perceptions. Like Crick, Mr. Koch follows a strictly materialist paradigm that nerve interactions are responsible for mental states. In other words, he says, “no matter, never mind.”

Crick summed up the materialist theory in The Astonishing Hypothesis: The Scientific Search for the Soul (Scribner, 1994). He described that hypothesis as the idea that “your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules.”

What many people may find ‘astonishing’ about Crick’s hypothesis is that among neurobiologists it is anything but astonishing. It is simply taken for granted as the way things are. Is it surprising that religious believers find such a conclusion unsettling?

Next: What does “free will” mean at a microscopic level?

POST SCRIPT: Why invading Iraq was morally and legally wrong

Jacob G. Hornberger, founder and president of The Future of Freedom Foundation, has written a powerful essay that lays out very clearly why the US invasion and occupation of Iraq is morally and legally indefensible, and why it has inevitably led to the atrocities that we are seeing there now, where reports are increasingly emerging of civilians being killed by US forces. Hornberger writes: “I do know one thing: killing Iraqi children and other such ‘collateral damage’ has long been acceptable and even ‘worth it’ to U.S. officials as part of their long-time foreign policy toward Iraq.”

The article is well worth reading.

IDC gets on board the brain train

An article titled Religion on the Brain (subscription required) in the May 26, 2006 issue of the Chronicle of Higher Education (Volume 52, Issue 38, Page A14) examined what neuroscientists are discovering about religion and the brain. It is a curious article. The author (Richard Monastersky) seems to be trying very hard to find evidence in support of the idea that brain research is pointing to the independent existence of a soul/mind, but it is clear on reading it that he comes up short and that there is no such evidence, only the hopes of a very small minority of scientists.

Religion’s last stand-2: The role of Descartes

In the previous posting, I discussed two competing models of the mind/brain relationship.

It seems to me that the first model, where the physical brain is all there is and the mind is simply the creation of the brain, is the most persuasive one since it is the simplest and accepting it involves no further complications. In this model, our bodies are purely material things, with the brain’s workings enabling us to think, speak, reason, act, and so forth. The idea of ‘free will’ is an illusion that arises because the brain is an enormously complicated system whose processes and end results cannot be predicted. (A good analogy would be classically chaotic systems like the weather. Because of the non-linearity of the equations governing the weather, we cannot predict long-term weather even though the system is deterministic and materialistic.)
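The weather analogy can be made concrete with a tiny numerical experiment. The sketch below is my own illustration (it is not from the original post), using the logistic map, a standard textbook example of a chaotic system: two starting values that differ by one part in a billion are iterated by the same deterministic rule, and within a few dozen steps the trajectories no longer resemble each other.

```python
# A minimal sketch of deterministic chaos (illustrative; not from the original post).
# The logistic map x -> r * x * (1 - x) with r = 4 is fully deterministic, yet tiny
# differences in the starting value grow until the trajectories are unrelated.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000000)   # starting value
b = logistic_trajectory(0.200000001)   # differs by one part in a billion

for step in (0, 10, 20, 30, 40, 50):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}")
# Early steps agree to many decimal places; by about step 40 the two values are
# unrelated -- the rule is deterministic, but long-term prediction is hopeless
# unless the starting state is known to impossible precision.
```

The same point holds for the vastly more complicated equations governing the weather: determinism at the level of the rules does not translate into practical predictability.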

The second model, that of an independently existing non-material mind/soul, separate from the brain and directing the brain, immediately raises all kinds of problems, which have long been recognized. The scientist-philosopher Rene Descartes (1596-1650) of “I think, therefore I am” fame was perhaps the first person to formulate this mind-body dualism (or at least he is the person most closely associated with the idea) and it is clear that he felt that it was necessary to adopt this second model if one was to retain a belief in god.

But he realized immediately that it raises the problem of how the non-material mind/soul can interact with the material brain/body to get it to do things. Princess Elizabeth of Bohemia, with whom Descartes had an extended correspondence, was unable to understand Descartes’ explanation of this interaction and kept prodding him on this very question. Descartes had no adequate answer for her, even though both clearly wanted to believe in the existence of god and the soul. In the introduction to his translation of Descartes’ Meditations and other Metaphysical Writings (which contains extended segments of the Elizabeth-Descartes correspondence), Desmond Clarke writes:

After repeated attempts to answer the question, how is it possible for something which is not physical to interact with something else which, by definition, is physical?, Descartes concedes that he cannot explain how it is possible.

But he tried, using the best scientific knowledge available to him at that time. He argued that the soul’s interaction with the body took place in the pineal gland.

As is well known, Descartes chose the pineal gland because it appeared to him to be the only organ in the brain that was not bilaterally duplicated and because he believed, erroneously, that it was uniquely human. . . By localizing the soul’s contact with body in the pineal gland, Descartes had raised the question of the relationship of mind to the brain and nervous system. Yet at the same time, by drawing a radical ontological distinction between body as extended and mind as pure thought, Descartes, in search of certitude, had paradoxically created intellectual chaos.

Although Descartes failed in his efforts to convincingly demonstrate the independent existence of the soul, research into the relationship of religious beliefs to the brain and central nervous system has continued.

Descartes is an interesting character. Much of his scientific work, and even his temperament, seem to indicate a materialistic outlook. But at the same time, he took great pains to try to find proofs of god’s existence. One gets the sense that he was a person trying to convince himself of something he did not quite believe in, and that, had he lived in a different time, he might have rejected god with some relief. The article on Descartes in Encyclopaedia Britannica Online (13 June 2006) says:

Even during Descartes’s lifetime there were questions about whether he was a Catholic apologist, primarily concerned with supporting Christian doctrine, or an atheist, concerned only with protecting himself with pious sentiments while establishing a deterministic, mechanistic, and materialistic physics.

The article points to a possible reason for the ambiguity of his views: there was, at that time, considerable fear of the power of the Catholic Church, and this may have guided the way he presented his work.

In 1633, just as he was about to publish The World (1664), Descartes learned that the Italian astronomer Galileo Galilei (1564–1642) had been condemned in Rome for publishing the view that the Earth revolves around the Sun. Because this Copernican position is central to his cosmology and physics, Descartes suppressed The World, hoping that eventually the church would retract its condemnation. Although Descartes feared the church, he also hoped that his physics would one day replace that of Aristotle in church doctrine and be taught in Catholic schools.

Descartes definitely comes across as somewhat less than pious, and non-traditional in his religious beliefs.

Descartes himself said that good sense is destroyed when one thinks too much of God. He once told a German protégée, Anna Maria van Schurman (1607–78), who was known as a painter and a poet, that she was wasting her intellect studying Hebrew and theology. He also was perfectly aware of – though he tried to conceal – the atheistic potential of his materialist physics and physiology. Descartes seemed indifferent to the emotional depths of religion. Whereas Pascal trembled when he looked into the infinite universe and perceived the puniness and misery of man, Descartes exulted in the power of human reason to understand the cosmos and to promote happiness, and he rejected the view that human beings are essentially miserable and sinful. He held that it is impertinent to pray to God to change things. Instead, when we cannot change the world, we must change ourselves.

Clearly he was not orthodox in his thinking. Although he tried to believe in god, it was his emphasis on applying the materialistic principles of his scientific work to identifying the mechanism by which the mind interacts with the brain that has the potential to create a big problem for religion.

To sum up Descartes’ argument: following sound scientific (methodologically naturalistic) principles, he felt that if the mind interacted with the brain, then there had to be (1) some mechanism by which the non-material mind could influence the material brain, and (2) some place where this interaction took place. Although he could not satisfactorily answer the first question, he at least postulated a location for the interaction, the pineal gland. We now know that this is wrong, but the questions he raised are still valid and interesting ones that go to the heart of religion.

Next: What current researchers are finding about the brain and religion.

POST SCRIPT: Documentary on Rajini Rajasingham-Thiranagama

I have written before about the murder of my friend Rajini Rajasingham-Thiranagama, who had been an active and outspoken campaigner for human rights in Sri Lanka. I have learned that a documentary about her life called No More Tears Sister is the opening program in the 2006 PBS series P.O.V.

In the Cleveland area, the program is being shown on Friday, June 30, 2006 at 10:00pm on WVIZ 25. Airing dates vary by location, with some PBS stations showing it as early as June 27. The link above gives program listings for other cities. The synopsis on the website says:

If love is the first inspiration of a social revolutionary, as has sometimes been said, no one better exemplified that idea than Dr. Rajani Thiranagama. Love for her people and her newly independent nation, and empathy for the oppressed of Sri Lanka – including women and the poor – led her to risk her middle-class life to join the struggle for equality and justice for all. Love led her to marry across ethnic and class lines. In the face of a brutal government crackdown on her Tamil people, love led her to help the guerrilla Tamil Tigers, the only force seemingly able to defend the people. When she realized the Tigers were more a murderous gang than a revolutionary force, love led her to break with them, publicly and dangerously. Love then led her from a fulfilling professional life in exile back to her hometown of Jaffna and to civil war, during which her human-rights advocacy made her a target for everyone with a gun. She was killed on September 21, 1989 at the age of 35.

As beautifully portrayed in Canadian filmmaker Helene Klodawsky’s “No More Tears Sister,” kicking off the 19th season of public television’s P.O.V. series, Rajani Thiranagama’s life is emblematic of generations of postcolonial leftist revolutionaries whose hopes for a future that combined national sovereignty with progressive ideas of equality and justice have been dashed by civil war – often between religious and ethnic groups, and often between repressive governments and criminal rebel gangs. Speaking out for the first time in the 15 years since Rajani Thiranagama’s assassination, those who knew her best talk about the person she was and the sequence of events that led to her murder. Especially moving are the memories of Rajani’s older sister, Nirmala Rajasingam, with whom she shared a happy childhood, a political awakening and a lifelong dedication to fighting injustice; and her husband, Dayapala Thiranagama, who was everything a middle-class Tamil family might reject – a Sinhalese radical student from an impoverished rural background. Also included are the recollections of Rajani’s younger sisters, Vasuki and Sumathy; her parents; her daughters, Narmada and Sharika; and fellow human-rights activists who came out of hiding to tell her story. The film rounds out its portrayal with rare archival footage, personal photographs and re-enactments in which Rajani is portrayed by daughter Sharika Thiranagama. The film is narrated by Michael Ondaatje, esteemed author of The English Patient and Anil’s Ghost.

I knew Rajini well. We were active members of the Student Christian Movement in Sri Lanka when we were both undergraduates at the University of Colombo. It does not surprise me in the least that she threw herself with passion into the struggle for justice. She was brave and spoke the truth, even when it was unpalatable to those in power and with guns, and backed up her words with actions, thus putting her life on the line for her beliefs. Such people are rare. I am proud to have known her.

The desire for belief preservation

In the previous post we saw that human beings are believed not to be natural critical thinkers: we tend to accept the first plausible explanation that comes along, rather than seeing such initial explanations as merely hypotheses to be evaluated against competing ones.

One might think that when we are exposed to alternative hypotheses, we would then shift gears into a critical mode. But Tim van Gelder, writing in the article Teaching Critical Thinking: Some Lessons from Cognitive Science (College Teaching, Winter 2005, vol. 53, no. 1, pp. 41-46), argues that what foils this is the human desire for belief preservation.

He quotes the seventeenth-century philosopher Francis Bacon, who said:

The mind of man is far from the nature of a clear and equal glass, wherein the beams of things should reflect according to their true incidence; nay, it is rather like an enchanted glass, full of superstition and imposture, if it be not delivered and reduced.

In other words, van Gelder says, “the mind has intrinsic tendencies toward illusion, distortion, and error.” These arise from a combination of being hard-wired in our brains (because of evolution), natural growth of our brains as we grow up in the Earth’s environment, and the influence of our societies and cultures. “Yet, whatever their origin, they are universal and ineradicable features of our cognitive machinery, usually operating quite invisibly to corrupt our thinking and contaminate our beliefs.”

All these things lead us to have cognitive biases and blind spots that prevent us from seeing things more clearly, and one of the major blind spots is that of belief preservation. van Gelder says that “At root, belief preservation is the tendency to make evidence subservient to belief, rather than the other way around. Put another way, it is the tendency to use evidence to preserve our opinions rather than guide them.”

van Gelder says that when we strongly believe some thing or desire it to be true, we tend to do three things: “1. We seek evidence that supports what we believe and do not seek and avoid or ignore evidence that goes against it. . . 2. We rate evidence as good or bad depending on whether it supports or conflicts with our belief. That is, the belief dictates our evaluation of the evidence, rather than our evaluation of the evidence determining what we should believe. . . 3. We stick with our beliefs even in the face of overwhelming contrary evidence as long as we can find at least some support, no matter how slender.”

This would explain why (as vividly demonstrated in the popular video A Private Universe) people hold on to their erroneous explanations about the phases of the moon even after they have been formally instructed in school about the correct explanation.

This would also explain the question that started these musings: Why for so long had I not applied the same kinds of questioning to my religious beliefs concerning god, heaven, etc. that I routinely applied to other areas of my life? The answer is that since I grew up in a religious environment and accepted the existence of god as plausible, I did not seek other explanations. Any evidence in favor of belief (the sense of emotional upliftment that sometimes occurs during religious services or private prayer, or some event that could be interpreted to indicate god’s action in my life or in the world, or scientific evidence that supported a statement in the Bible) was seized on, while counter evidence (such as massive death and destruction caused by human or natural events, personal misfortunes or tragedies, or scientific discoveries that contradicted Biblical texts) was either ignored or explained away. It was only after I had abandoned my belief in god’s existence that I was able to ask the kinds of questions that I had hitherto avoided.

Did I give up my belief because I could not satisfactorily answer the difficult questions concerning god? Or did I start asking those questions only after I had given up belief in god? In some sense this is a chicken-and-egg problem. Looking back, it is hard to say. Probably it was a little of both. Once I started taking some doubts seriously and started questioning, this probably led to more doubts, more questions, until finally the religious edifice that I had hitherto believed in just collapsed.

In the series of posts dealing with the burden of proof concerning the existence of god, I suggested that if we use the common yardsticks of law or science, the burden of proof lies with the person postulating the existence of any entity (whether it be god or a neutrino or whatever), and that in the absence of positive evidence in favor of existence, the default is to assume the non-existence of that entity.

In a comment to one of those postings, Paul Jarc suggested that the burden of proof actually lay with the person trying to convince the other person to change his views. It may be that we are both right. What I was describing was the way that I thought things should be, while Paul was describing the way things are in actual life, due to the tendency of human beings to believe the first thing that sounds right and makes intuitive sense, coupled with the desire to preserve strong beliefs once formed.

van Gelder ends his article with some good advice:

Belief preservation strikes right at the heart of our general processes of rational deliberation. The ideal critical thinker is aware of the phenomenon, actively monitors her thinking to detect its pernicious influence, and deploys compensatory strategies.

Thus, the ideal critical thinker
• puts extra effort into searching for and attending to evidence that contradicts what she currently believes;
• when “weighing up” the arguments for and against, gives some “extra credit” for those arguments that go against her position; and
• cultivates a willingness to change her mind when the evidence starts mounting against her.

Activities like these do not come easily. Indeed, following these strategies often feels quite perverse. However, they are there for self-protection; they can help you protect your own beliefs against your tendency to self-deception, a bias that is your automatic inheritance as a human being. As Richard Feynman said, “The first principle is that you must not fool yourself – and you are the easiest person to fool.”

The practice of science requires us to routinely think this way. It is not easy to do, and even scientists find it hard to give up their cherished theories in the face of contrary evidence. The fact that scientific practice demands this kind of thinking may also be why science is perceived as ‘hard’ by the general public: not because of its technical difficulties, but because you are constantly being asked to give up beliefs that seem so naturally true and intuitively obvious.

POST SCRIPT: The people who pay the cost of war

I have nothing to add to this powerful short video, set to the tune of Johnny Cash singing Hurt. Just watch. (Thanks to Jesus’ General.)

Seeing the world through Darwin’s eyes

It is good to be back and blogging again!

On my trip to Australia, I had the chance to see some of the marsupial animals that are native to that continent, and as I gazed at these strange and wondrous creatures, I asked myself the same question that all visitors to the continent before me must have asked: Why are these animals so different from the ones I am familiar with? After all, Australia’s environment is not that different from that found in other parts of the world, but the fact that most marsupials (like kangaroos, wallabies, koalas, and wombats) are found only on that continent is remarkable. I was stunned to learn that when a kangaroo is born, it weighs less than one gram. This is because in marsupials much of the development of the newborn, which in other mammals occurs inside the mother’s womb, takes place in the pouch.

Why scientific theories are more than explanations

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

At its heart, the main strategy of intelligent design creationism (IDC) advocates is to find phenomena that are not (at least in their eyes) satisfactorily explained by evolutionary theory and to argue that natural selection is therefore a failed theory. They say that adding the postulate of an ‘intelligent designer’ (which is clearly a pseudonym for God) as the cause of these so-called unexplained phenomena means that they are no longer unexplained. This, they claim, makes IDC the better ‘explanation.’ Some (perhaps for tactical reasons) do not go so far and instead say that it is at least a competing explanation and thus on a par with evolution.

Why IDC is not science

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

In the previous posting, I pointed out that if one looks back at the history of science, all the theories that are considered to be science are both (1) naturalistic and (2) predictive. Thus these two things constitute necessary conditions.

This is an important fact to realize when so-called intelligent design creationism (IDC) advocates argue that theirs is a ‘scientific’ theory. If so, the first hurdle IDC must surmount is that it meet both those necessary criteria if it is to be even eligible to be considered science. It has to be emphasized that meeting those conditions is not sufficient for something to be considered science, but the question of sufficiency does not even arise because IDC does not meet either of the two necessary conditions.

I issued this challenge to the IDC proponents when I debated them in Kansas in 2002. I pointed out that nowhere did they provide any kind of mechanism that enabled them to predict anything that anyone could go out and look for. And they still haven’t. At its essence, IDC strategy is to (1) point to a few things that they claim evolutionary theory cannot explain; (2) assert that such phenomena have too low a probability to be explained by any naturalistic theory; and (3) draw the conclusion that those phenomena must have been caused by an ‘unspecified designer’ (with a nudge, nudge, wink, wink to the faithful that this is really God) whose workings are beyond the realm of the natural world explored by science.

Thus they postulate a non-natural cause for those phenomena and cannot predict anything that any person could go and look for. (This is not surprising. The designer is, for all intents and purposes, a synonym for God, and it would be a bit bizarre to our traditional concept of God to think that his/her actions should be as predictable as those of blocks sliding down inclined planes.) When I asked one of the IDC stalwarts (Jonathan Wells) during my visit to Hillsdale College for an IDC prediction, the best he could come up with was that there would be more unexplained phenomena in the future, or words to that effect.

But that is hardly what is meant by a scientific prediction. I can make the same kind of vague prediction about any theory, even a commonly accepted scientific one, since no theory ever explains everything. A scientific prediction takes the more concrete form: “The theory Z encompassing this range of phenomena predicts that if conditions X are met, then we should be able to see result Y.”
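To make that abstract form concrete, here is a small worked example of my own (it is not in the original post): Newtonian mechanics plays the role of theory Z, “a ball is released from rest at a known height near the Earth’s surface, with air resistance negligible” plays the role of conditions X, and the computed fall time is the result Y that anyone can then go out and measure.

```python
# A sketch of a concrete scientific prediction (my own illustration, not from the post).
# Theory Z: Newtonian mechanics with constant surface gravity and negligible air drag.
# Conditions X: a ball is released from rest at height h.
# Result Y: it should hit the ground after t = sqrt(2h / g) seconds.

import math

G = 9.81  # m/s^2, assumed value for the acceleration due to gravity at the surface

def predicted_fall_time(height_m):
    """Return the predicted fall time in seconds for a drop from height_m metres."""
    return math.sqrt(2.0 * height_m / G)

for h in (1.0, 5.0, 20.0):
    print(f"drop from {h:4.1f} m -> predicted fall time {predicted_fall_time(h):.2f} s")
# If careful measurements systematically disagreed with these numbers, the theory as
# applied under these conditions would be in trouble -- which is exactly what makes
# such a prediction scientifically useful, and what a vague "more unexplained
# phenomena will turn up" prediction lacks.
```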

IDC advocates know that their model comes nowhere close to meeting this basic condition of science. So they have adopted the strategy of: (1) challenging the naturalism condition, arguing that it is not a necessary condition for science and that it has been specifically and unfairly adopted to exclude IDC from science; and (2) trying to create a new definition of science so that IDC can be included. This takes the form of arguing that a scientific theory is one that ‘explains’ phenomena.

There are variations and expansions on these arguments by the various members of the IDC camp, but I have tried to reduce them to their skeletal elements. These variations are designed to blur the issues but are easy to refute. (See this cartoon by Tom Tomorrow (thanks to Daniel for the link), this cartoon (thanks to Heidi), and this funny post by Canadian Cynic about the possible consequences of using IDC-type reasoning in other areas of life.)

The rejection by IDC advocates of naturalism and predictivity as necessary conditions for science goes against the history of science. Recall, for example, the struggle between the Ptolemaic and Copernican models of the universe. Remember that both sides of this debate involved religious believers. But when they tried to explain the motions of the planets, both sides used naturalistic theories. To explain the retrograde motion of Mercury and other seemingly aberrant behavior, they invoked epicycles and the like. They struggled hard to find models that would enable them to predict future motion. They did not invoke God by saying things like “God must be moving the planets backwards on occasion” or “This seemingly anomalous motion of Mercury is due to God.” Such an explanation would not have been of any use to them because allowing God into the picture would preclude the making of predictions.

In fact, the telling piece of evidence that ended the geocentric model was that the Rudolphine Tables using Kepler’s elliptical orbits and a heliocentric model were far superior to any alternative in predicting planetary motion.

While it may be true that the underlying beliefs that drove people of that time to support the Ptolemaic or Copernican model were influenced by their religious outlook, they did not seem to invoke God in a piecemeal way, as an explanation for this or that isolated phenomenon, as is currently done by IDC advocates. Instead they were more concerned with posing the question of whether the whole structure of the scientific theory was consistent with their understanding of the working of God. In other words, they were debating whether a geocentric model was compatible with their ideas of God’s role in the world. The detailed motions of specific planets, however problematic, seem to have been too trivial for them to invoke God as an explanation, although they would probably not have excluded this option as something that God was capable of doing.

It may also well be true that some scientists of that time thought that God might be responsible for such things but such speculations were not part of the scientific debate. For example, Newton himself is supposed to have believed that the stability of the solar system (which was an unexplained problem in his day and remained unsolved for about 200 years) was due to God periodically intervening to restore the initial conditions. But these ideas were never part of the scientific consensus. And we can see why. If scientists had said that the stability was due to God, and closed down that avenue of research, then scientists would never have solved this important problem by naturalistic means and thus advanced the cause of science. This is why scientists, as a community, never accept non-natural explanations for any phenomena, even though individual scientists may entertain such ideas.

So the attempts by IDC advocates to redefine science to leave out methodological naturalism and predictivity fly completely in the face of the history of science. But worse than that, such a move would undermine the very methods that have made science so successful.

In the next posting, we will see why just looking for ‘good’ explanations of scientific phenomena (the definition of science advocated by the IDC people) is not, by itself, a useful exercise for science.

What is science?

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

Because of my interest in the history and philosophy of science I am sometimes called upon to answer the question “what is science?” Most people think that the answer should be fairly straightforward. This is because science is such an integral part of our lives that everyone feels that they intuitively know what it is and think that the problem of defining science is purely one of finding the right combination of words that captures their intuitive sense.

But as I said in my previous posting, strictly defining things means having demarcation criteria, which involves developing a set of necessary and sufficient conditions, and this is extremely hard to do even for seemingly simple things like (say) defining what a dog is. So it should not be surprising that it is even harder to do for an abstract idea like science.

But just as a small child is able, based on its experience with pets, to distinguish between a dog and a cat without any need for formal demarcation criteria, so can scientists intuitively sense what is science and what is not science, based on the practice of their profession, without any need for a formal definition. So scientists do not, in the normal course of their work, pay much attention to whether they have a formal definition of science or not. If forced to define science (say for the purpose of writing textbooks) they tend to make up some kind of definition that sort of fits with their experience, but such ad-hoc formulations lack the kind of formal rigor that is strictly required of a philosophically sound demarcation criterion.

The absence of an agreed-upon formal definition of science has not hindered science from progressing rapidly and efficiently. Science marches on, blithely unconcerned about its lack of self-definition. People start worrying about definitions of science mainly in the context of political battles, such as those involving so-called intelligent design creationism (or IDC), because advocates of IDC have been using this lack of a formal definition to try to define science in such a way that their pet idea be included as science, and thus taught in schools as part of the science curriculum and as an alternative to evolution.

Having a clear-cut demarcation criterion that defines science and is accepted by all would settle this question once and for all. But finding this demarcation criterion for science has proven to be remarkably difficult.

To set about trying to find such criteria, we do what we usually do in all such cases: we look at all the knowledge that is commonly accepted as science by everyone and see whether we can identify similarities among these areas. For example, I think everyone would agree that the subjects that come under the headings of astronomy, geology, physics, chemistry, and biology, and which are studied by university departments in reputable universities, all come under the heading of science. So any definition of science that excluded any of these areas would be clearly inadequate, just as any definition of ‘dog’ that excluded a commonly accepted breed would be dismissed as inadequate.

This is the kind of thing we do when trying to define other things, like art (say). Any definition of art that excluded (say) paintings hanging in reputable museums would be considered an inadequate definition.

When we look back at the history of the topics studied by people in those named disciplines and which are commonly accepted as science, two characteristics stand out. The first thing that we realize is that for a theory to be considered scientific it does not have to be true. Newtonian physics is commonly accepted to be scientific, although it is not considered to be universally true anymore. The phlogiston theory of combustion is considered to be scientific though it has long since been overthrown by the oxygen theory. And so on. In fact, since all knowledge is considered to be fallible and liable to change, truth is, in some sense, irrelevant to the question of whether something is scientific or not, because absolute truth cannot be established.

(A caveat: Not all scientists will agree with me on this last point. Some scientists feel that once a theory is shown to be incorrect, it ceases to be part of science, although it remains a part of science history. Some physicists also feel that many of the current theories of (say) sub-atomic particles are unlikely to be ever overthrown and are thus true in some absolute sense. I am not convinced of this. The history of science teaches us that even theories that were considered rock-solid and lasted millennia (such as the geocentric universe) eventually were overthrown.)

But there is a clear pattern that emerges about scientific theories. All the theories that are considered to be science are (1) naturalistic and (2) predictive.

By naturalistic I mean methodological naturalism and not philosophical naturalism. The latter, I argued in an earlier posting where these terms were defined, is irrelevant to science.

By predictive, I mean that all theories that are considered part of science have some explicit mechanism or structure that enables the users of these theories to make predictions: to say what one should see if one did some experiment or looked in some place under certain conditions.

Note that these two conditions are just necessary conditions and by themselves are not sufficient. (See the previous posting for what those terms mean.) They can only classify things into "may be science" (if something meets both conditions) or "not science" (if it fails either one). So these two conditions together do not make up a satisfactory demarcation criterion. For example, the theory that if a football quarterback throws a lot of interceptions his team is likely to lose meets both the naturalistic and predictive conditions, but it is not considered part of science.
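The asymmetry of necessary conditions can be spelled out in a few lines of code. The toy sketch below is my own illustration (the entries and the verdicts are only meant to mirror the examples in this post): failing either condition settles the matter, but passing both earns at most a "may be science", which is exactly why these two conditions alone cannot serve as a demarcation criterion.

```python
# Toy illustration (mine, not from the post) of necessary-but-not-sufficient conditions.
# Violating either condition is enough to conclude "not science"; satisfying both is
# not enough to conclude "science", only "may be science".

def classify(is_naturalistic, is_predictive):
    if not (is_naturalistic and is_predictive):
        return "not science"       # a necessary condition is violated
    return "may be science"        # necessary conditions met; sufficiency still open

examples = {
    "Newtonian physics":              (True, True),
    "quarterback-interception rule":  (True, True),    # passes both, yet not science
    "intelligent design creationism": (False, False),  # meets neither condition
}

for name, (naturalistic, predictive) in examples.items():
    print(f"{name:32s} -> {classify(naturalistic, predictive)}")
```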

But even though we do not have a rigorous demarcation criterion for science, the existence of just necessary conditions still has interesting implications, which we shall explore in later postings.