Charles Darwin in his own words

I have written a lot about the theory of evolution and in the process have quoted short excerpts from various authors, Charles Darwin included. (Please see here for previous posts in this series.)

But in going back and reading the first edition of On the Origin of Species (1859), I am struck by how prescient Darwin was in anticipating the objections that would be raised against his theory and why. He could well have been talking about the situation today, except that then the people who were skeptical, and whom he was trying to persuade, were his scientific colleagues. Nowadays scientists are almost all converts to natural selection (as he predicted might happen), and it is religious lay people who make the same objections he addressed long ago.

To get the full flavor of Darwin’s thinking and his style of writing, here is a somewhat long passage from his conclusions, where he summarizes his case (p. 480-484). The sections in boldface are my own emphasis. (Darwin’s complete works are now available online.)

I have now recapitulated the chief facts and considerations which have thoroughly convinced me that species have changed, and are still slowly changing by the preservation and accumulation of successive slight favourable variations. Why, it may be asked, have all the most eminent living naturalists and geologists rejected this view of the mutability of species? It cannot be asserted that organic beings in a state of nature are subject to no variation; it cannot be proved that the amount of variation in the course of long ages is a limited quantity; no clear distinction has been, or can be, drawn between species and well-marked varieties. It cannot be maintained that species when intercrossed are invariably sterile, and varieties invariably fertile; or that sterility is a special endowment and sign of creation. The belief that species were immutable productions was almost unavoidable as long as the history of the world was thought to be of short duration; and now that we have acquired some idea of the lapse of time, we are too apt to assume, without proof, that the geological record is so perfect that it would have afforded us plain evidence of the mutation of species, if they had undergone mutation.

But the chief cause of our natural unwillingness to admit that one species has given birth to other and distinct species, is that we are always slow in admitting any great change of which we do not see the intermediate steps. The difficulty is the same as that felt by so many geologists, when Lyell first insisted that long lines of inland cliffs had been formed, and great valleys excavated, by the slow action of the coast-waves. The mind cannot possibly grasp the full meaning of the term of a hundred million years; it cannot add up and perceive the full effects of many slight variations, accumulated during an almost infinite number of generations.

Although I am fully convinced of the truth of the views given in this volume under the form of an abstract, I by no means expect to convince experienced naturalists whose minds are stocked with a multitude of facts all viewed, during a long course of years, from a point of view directly opposite to mine. It is so easy to hide our ignorance under such expressions as the “plan of creation,” “unity of design,” &c., and to think that we give an explanation when we only restate a fact. Any one whose disposition leads him to attach more weight to unexplained difficulties than to the explanation of a certain number of facts will certainly reject my theory. A few naturalists, endowed with much flexibility of mind, and who have already begun to doubt on the immutability of species, may be influenced by this volume; but I look with confidence to the future, to young and rising naturalists, who will be able to view both sides of the question with impartiality. Whoever is led to believe that species are mutable will do good service by conscientiously expressing his conviction; for only thus can the load of prejudice by which this subject is overwhelmed be removed.

Several eminent naturalists have of late published their belief that a multitude of reputed species in each genus are not real species; but that other species are real, that is, have been independently created. This seems to me a strange conclusion to arrive at. They admit that a multitude of forms, which till lately they themselves thought were special creations, and which are still thus looked at by the majority of naturalists, and which consequently have every external characteristic feature of true species,—they admit that these have been produced by variation, but they refuse to extend the same view to other and very slightly different forms. Nevertheless they do not pretend that they can define, or even conjecture, which are the created forms of life, and which are those produced by secondary laws. They admit variation as a vera causa in one case, they arbitrarily reject it in another, without assigning any distinction in the two cases. The day will come when this will be given as a curious illustration of the blindness of preconceived opinion. These authors seem no more startled at a miraculous act of creation than at an ordinary birth. But do they really believe that at innumerable periods in the earth’s history certain elemental atoms have been commanded suddenly to flash into living tissues? Do they believe that at each supposed act of creation one individual or many were produced? Were all the infinitely numerous kinds of animals and plants created as eggs or seed, or as full grown? and in the case of mammals, were they created bearing the false marks of nourishment from the mother’s womb? Although naturalists very properly demand a full explanation of every difficulty from those who believe in the mutability of species, on their own side they ignore the whole subject of the first appearance of species in what they consider reverent silence.

It may be asked how far I extend the doctrine of the modification of species. The question is difficult to answer, because the more distinct the forms are which we may consider, by so much the arguments fall away in force. But some arguments of the greatest weight extend very far. . . I believe that animals have descended from at most only four or five progenitors, and plants from an equal or lesser number.

Analogy would lead me one step further, namely, to the belief that all animals and plants have descended from some one prototype. But analogy may be a deceitful guide. Nevertheless all living things have much in common, in their chemical composition, their germinal vesicles, their cellular structure, and their laws of growth and reproduction. We see this even in so trifling a circumstance as that the same poison often similarly affects plants and animals; or that the poison secreted by the gall-fly produces monstrous growths on the wild rose or oak-tree. Therefore I should infer from analogy that probably all the organic beings which have ever lived on this earth have descended from some one primordial form, into which life was first breathed.

And then the very last, almost poetic, words in the book (p. 490):

There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.

Darwin’s achievements are truly magnificent, putting him in the same class as Einstein and Newton, among the greatest scientists of all time.

POST SCRIPT: The Larry Craig incident

Senator Larry Craig (R-Idaho) has taken some strong “family values” and anti-gay stands in the past, despite long-standing rumors that he himself was gay. The recent news report that he had pleaded guilty to “lewd” conduct in a public restroom has caused speculation that his career is now over.

It is despicable to harass gays with anti-gay rhetoric and legislation, and it is even worse if those doing so are secretly gay themselves. But Talking Points Memo expresses well my unease with what happened to Craig in this most recent episode. It is not clear from published reports that he did anything that really warranted his arrest; as Josh Marshall says, he was essentially caught in a Catch-22 created by his own risky behavior.

Glenn Greenwald documents the brazen contradictions in the way right-wingers have responded to the recent Craig revelation, to the reports that surfaced back in 2006 that he was gay, and to the recent case of Senator David Vitter (R-Louisiana), another “family values” champion, who was found to be a customer of prostitutes.

The history of western atheism-2: The beginnings of modern atheism

(For previous posts in this series, see here.)

The philosopher Rene Descartes (1596-1650) may have unwittingly been the trigger for the revival of freethinking during the Enlightenment. Although he always asserted his own fidelity to the teachings of the church, the clarity of his thinking about the mind-body relationship exposed some of the fundamental problems and contradictions that inevitably accompany religious beliefs.

Belief in god has always required a kind of dualistic ‘two different worlds and two different kinds of matter’ way of thinking, but usually left unexamined the thorny questions of how the two interacted. Descartes’ exposition on this duality and his attempts to find a way by which the world and matter of god interacted with the world and matter of people exposed the difficulties with dualism, problems which plague thoughtful believers to this day as they try to reconcile a scientific perspective with religious faith.

Jonathan Miller in Atheism: A Rough History of Disbelief suggests that the first modern philosopher to seriously challenge the basis of the existing religious orthodoxies was Thomas Hobbes (1588-1679). He advocated ‘monism’, the idea that only one kind of stuff exists, and that stuff is what we see as matter. This ruled out dualism and, with it, non-material entities like the soul and god. Although Hobbes’s book Leviathan (1651) advocated a strict materialism of both human nature and knowledge, he was not really an atheist and might better be classified as one of the first modern deists: someone who allows for the existence of some prime mover who set the universe in motion but does not interfere subsequently.

The official climate in Hobbes’s time still strongly discouraged any form of skepticism, and people had to be cautious about going against these norms of belief. Perhaps as a result of the alarm that the spread of views like Hobbes’s caused among the supporters of religion, in 1694 the British parliament held a long debate and passed a bill prescribing the death penalty for blasphemy for anyone who denied divinity. Early drafts of the bill even included atheism as grounds for execution, although that provision did not survive into the final law. But it gives us a sense of the public opprobrium one risked by espousing any form of heterodoxy.

One can see the strong appeal of deism for freethinkers in those times. Deism allowed people to formally genuflect to god and maintain a stance of official belief while allowing free rein to their intellect in all other matters, especially science, since in the deist framework god was invoked only to explain the original creation of the universe and its laws, and maintained a strict hands-off policy after that. Since atheism could be grounds for persecution, punishment, and even execution, it seems reasonable to suppose that many deists of those days were closeted atheists.

The fact that many of the prominent leaders of the American revolution (such as George Washington, Thomas Jefferson, Benjamin Franklin, Ethan Allen, James Madison, and James Monroe) were deists and had no trouble advocating the constitutional separation of church and state makes sense in the light of this historical context. They were rebelling against the restrictive entanglements of religion with government back in England, while trying not to get too far ahead of their own populace in terms of religion. After all, there have always been influential religious zealots in America, some of whom even went to the extent of seeking out and executing witches, and it would not have been politically expedient to disavow god altogether. Still, it is quite amazing how sophisticated in such matters the American political leadership of that time was, compared to the present day, when leaders publicly express a bizarre belief that god is actually in personal contact with them, and some do not even accept the theory of evolution.

While Hobbes with his theory of monism laid the philosophical basis for modern atheism, Miller argues that he cannot truly be identified as the first atheist. Neither can the philosopher David Hume (1711-1776), who followed in Hobbes’s footsteps. But both were definitely anti-religious and flirted publicly with atheism, and it would not be surprising if they were privately atheists, since both dropped hints that they suspected most people were a lot less pious than they publicly let on.

David Hume, writing in his The Natural History of Religion chapter XII (1757), suspected that there was a great deal of hypocritical piety among his contemporaries:

We may observe, that, notwithstanding the dogmatical, imperious style of all superstition, the conviction of the religionists, in all ages, is more affected than real, and scarcely ever approaches, in any degree, to that solid belief and persuasion, which governs us in the common affairs of life. Men dare not avow, even to their own hearts, the doubts which they entertain on such subjects: They make a merit of implicit faith; and disguise to themselves their real infidelity, by the strongest asseverations and most positive bigotry. But nature is too hard for all their endeavours, and suffers not the obscure, glimmering light, afforded in those shadowy regions, to equal the strong impressions, made by common sense and by experience. The usual course of men’s conduct belies their words, and shows, that their assent in these matters is some unaccountable operation of the mind between disbelief and conviction, but approaching much nearer to the former than to the latter.

One gets the impression that while the people of Hume’s time may not have publicly expressed disbelief, there were a lot of knowing winks and nudges exchanged when public piety was encountered.

I think that Hume is describing many people today as well.

Next in this series: The first published atheist.

POST SCRIPT: After Fredo

The Department of Justice, like the IRS, can function effectively only if it is perceived as being above partisan politics. Unlike most other government agencies, these two can wield great power over individuals, so any action they take has to be seen as not serving a partisan agenda.

Alberto Gonzales instigated and presided over the almost complete politicization of the Justice Department, making it serve as an extension of the White House, and his welcome departure is being accompanied by calls that he be replaced by someone who will restore some semblance of independence and integrity to that institution.

I am not sanguine that this will happen and am not sure why people have such high hopes. The Bush administration has a consistent track record of appointing to every position as partisan a political hack as it can get away with. Right now, the only constraint on its excesses is that the Democrats have to approve the nominee, but I fully expect the nominee to be someone they think can just squeak through the approval process.

This is one of those predictions where I hope I am wrong.

Reflections on the working poor

(Text of the talk given by me to the first year class at the Share the Vision program, Severance Hall, Cleveland, OH on Friday, August 24, 2007 at 1:00 pm. The common reading for the incoming class was David Shipler’s book The Working Poor: Invisible in America.)

Welcome to Case Western Reserve University! The people you will encounter here are very different from the people described in David Shipler’s book The Working Poor: Invisible in America and I would like to address the question: what makes that difference?

Two answers are usually given. One is that we live in a meritocracy, and that we got where we are because of our own virtues, that we are smarter or worked harder or had a better attitude and work ethic than those who didn’t make the cut. I am sure that everyone in this auditorium has been repeatedly told by their family and friends and teachers that they are good and smart, and it is tempting to believe it. What can be more gratifying than to be told that one’s success is due to one’s own ability and efforts? It makes it all seem so well deserved, that there is justice in the world.

Another answer is that luck plays an important role in educational success. I suspect that most of us were fortunate enough to be born into families that had most, if not all, of the following attributes: stable homes and families, good schools and teachers, safe environments, good health, and sufficient food and clothing. Others are not so fortunate and this negatively affects their performance in school.

But there is a third possibility that is not often discussed: that the educational system has been deliberately designed so that large numbers of people end up like the people in the book, people who not only have failed but, more importantly, have learned to think of themselves as failures.

This idea initially seems shocking. How can we want people to fail? Aren’t our leaders always exhorting everyone to aim high and succeed in education? But let’s travel back in time to the beginnings of widespread schooling in the US. In those early days, schooling was unplanned and focused more on meeting the needs of the learner and less on meeting the needs of the economy.

Recall that this was the time when the so-called robber barons were amassing huge personal wealth while workers endured appalling conditions. There was increasing concern that as the general public became more educated, more and more people would recognize and resent this unequal distribution of wealth.

This fear can be seen in an 1872 Bureau of Education document which speaks about the “problem of educational schooling”, according to which, “inculcating knowledge” teaches workers to be able to “perceive and calculate their grievances,” thus making them “more redoubtable foes” in labor struggles. (John Taylor Gatto, The Underground History of US Education (2003) p. 153, now available online.)

This was followed by the report in 1888 that said, “We believe that education is one of the principal causes of discontent of late years manifesting itself among the laboring classes.” (Gatto, p. 153)

The rising expectations of the general public had to be dampened and this was done by creating an education system that would shift the focus away from learning and more towards meeting the needs of the economy. And the economy then, like now, does not need or want everyone to be well educated.

After all, think what would happen if everyone got a good education and a college degree. Where would we get enough people like those in the book, willing to work for low wages, often with little or no benefits, at places like Wal-Mart so that we can buy cheap goods? Or at McDonald’s so that we can get cheap hamburgers? Or as cleaning staff at restaurants and hotels so that we can eat out often? Or in the fields and sweatshops so that we can get cheap food and clothes? As the French philosopher Voltaire pointed out long ago: “The comfort of the rich depends upon the abundance of the poor.”

One of the most influential figures in shifting education to meet the needs of the work force was Ellwood P. Cubberley, who wrote in 1905 that schools were to be factories “in which raw products, children, are to be shaped and formed into finished products… manufactured like nails, and the specifications for manufacturing will come from government and industry.” (Gatto, footnote on page 39 in the online edition of the book.)

He also wrote: “We should give up the exceedingly democratic idea that all are equal and that our society is devoid of classes.”

The natural conclusion of this line of reasoning was spelled out in a speech that Woodrow Wilson gave in 1909, three years before he was elected President of the United States. He said: “[W]e want to do two things in modern society. We want one class to have a liberal education. We want another class, a very much larger class of necessity, to forgo the privilege of a liberal education and fit themselves to perform specific difficult manual tasks.” (The Papers of Woodrow Wilson, vol. 18, 1908-1909, Princeton University Press, Princeton NJ, 1974, p. 597.)

So a third possible answer to why all of us are different from the people described in Shipler’s book is that the educational system is designed to make sure that only a small percentage (us) will succeed and a much larger percentage (like the people in the book) will fail.

But it is not enough simply to exclude people from success, as they will resent it and rebel. After all, all people have dreams of a good life. As Shipler writes on page 231: “Virtually all the youngsters I spoke with in poverty-ridden middle schools wanted to go on to college. . . Their ambitions spilled over the brims of their young lives.” They dreamed of becoming doctors, lawyers, nurses, archeologists, and policemen. But those dreams have to be crushed to meet the needs of the economy. And crushing people’s dreams carries risks.

The poet Langston Hughes warned what might happen in his poem A Dream Deferred:

What happens to a dream deferred?
Does it dry up
like a raisin in the sun?
Or fester like a sore–
And then run?
Does it stink like rotten meat?
Or crust and sugar over–
like a syrupy sweet?
Maybe it just sags
like a heavy load.
Or does it explode?

In order to prevent people with crushed dreams from exploding, you have to make them resigned to their fate, to think their failure is their own fault, to consider themselves unworthy. How do you do that? By making them experience failure and discouragement so repeatedly that by the time they reach high school, or even middle school, their love of learning has been destroyed. They have been beaten down, their hopes and dreams crushed by being told over and over that they are lazy and no good, so that they do not aim high and instead think of themselves as so worthless and invisible that it hardly matters whether they show up for work or not.

And we have done that. Currently we have an educational system in which people do primarily blame themselves for failure. As Shipler writes in his preface: “Rarely are they infuriated by their conditions, and when their anger surfaces, it is often misdirected against their spouses, their children, or their co-workers. They do not usually blame their bosses, their government, their country, or the hierarchy of wealth, as they reasonably could. They often blame themselves, and they are sometimes right.”

So does this mean that everything that our proud parents and teachers have told us about how smart we are is false? No, that is still true. What is false is the widespread belief that all the other people are poor because they are intrinsically stupid or lazy or incompetent.

You are now in a place that values knowledge and inquiry and has the resources to satisfy your curiosity about almost anything. And all this knowledge is freely shared with you, limited only by your own desire to learn. But all that knowledge should not be used to distance yourself even further from those who have not been as fortunate as you, or to think of yourself as superior to them.

All this knowledge is given to you so that you can become a better steward of the planet, so that you will try and create the kind of world where more people, in fact all people, can live the same kind of life that you will lead.

POST SCRIPT: Bye, Bye, Fredo

Alberto Gonzales surely must rank as a front-runner for the worst Attorney General ever, despite strong competition from people like President Nixon’s John Mitchell. In fact, the administration of George W. Bush has strong candidates for the worst ever nods in all the major categories: President, Vice President, Secretary of Defense, Secretary of State, and National Security Advisor.

Truly this is an administration that can only be described in superlatives.

The history of western atheism-1: The ancient origins

In the BBC4 TV program Atheism: A Rough History of Disbelief, host Jonathan Miller states flatly right at the beginning, “This series is about the disappearance of something – religious faith. . . The history of the growing conviction that god does not exist.”

(The full three-hour, three-part series can be seen from the beginning here. The price you pay for it being on YouTube is that each hour is chopped up into six ten-minute segments in order to meet the time restrictions. But the video and sound quality are excellent.)

Miller did a nice job of summarizing the rise and fall and rise again of freethinking. Strictly speaking, his is a survey of atheism just in the western world. In the eastern world of two millennia ago, the widespread acceptance of Confucianism, which placed very little emphasis on a god, and Buddhism, which required no belief in god, suggests that atheism was not perceived as negatively as in the west.

The Miller documentary is structured quite traditionally. It is long on voice-over narration by Miller as he walks through various imposing historical churches, museums, and other buildings and gazes upwards at portraits and statues of the people he is talking about, interspersed with interviews with scholars. It is Miller talking to the viewer in an informal, chatty way, interweaving the history of disbelief with his own journey to a comfortable atheism. But what it lacks in drama and glitz, it more than makes up in the low-key, understated charm that is characteristic of good BBC documentaries. The second and third hours are especially good as the pace picks up.

Miller points out that many of the early Greek philosophers were freethinkers, highly skeptical of the idea of a god. It is interesting that in those very early days, the Greeks had a much more sophisticated view of god and religion than we have even now, and the program provides many wonderful quotes about religion and god as evidence.

Epicurus (341-271 BCE) posed the essential and, to my mind, the ultimate contradiction that believers in god face: How to explain the existence of evil.

Is god willing to prevent evil but not able? Then he is not omnipotent.
Is he able but not willing? Then he is malevolent.
Is god both able and willing? Then whence cometh evil?
Is he neither able nor willing? Then why call him god?

These questions are usually avoided by religious people by invoking ignorance, the ‘mysterious ways’ clause, which says that god has reasons for allowing evil to occur that we are unable to comprehend, although it is not clear how they know that god does not want them to understand. But as the French philosopher Voltaire once said, “The truths of religion are never so well understood as by those who have lost the power of reasoning.”

Lucretius (circa 99-55 BCE) proposed a theory of the origins of religion and articulated an early formulation of naturalism: “Fear is the mother of all gods. Nature does all things spontaneously by herself without their meddling.”

Cicero (106-43 BCE) pointed out that it is obvious that there is no god and that much public piety is hypocritical and based on fear: “In this subject of the nature of the gods, the first question is do the gods exist or do they not? It is difficult, you will say, to deny that they exist. I would agree, if we were arguing the matter in a public assembly. But in a private discussion of this kind, it is perfectly easy to do so.”

In his Decline and Fall of the Roman Empire, Edward Gibbon writes: “The various modes of worship, which prevailed in the Roman world, were all considered by the people, as equally true; by the philosopher, as equally false; and by the magistrate, as equally useful.” A paraphrase of this sentiment is often misattributed to Seneca (circa 4 BCE-65 CE): “Religion is regarded by the common people as true, by the wise as false, and by rulers as useful.”

It is interesting that even though the climate for freethinking was better in the time of the early Greeks, Cicero’s quote illustrates that people who were skeptical about the existence of god still had to be discreet for fear of repercussions. That has continued to this day, which explains why so many atheists are still fearful about proclaiming their disbelief publicly.

The conversion to Christianity of the Roman Emperor Constantine (280-337 CE) led to Christianity becoming the favored religion of the Roman Empire and the beneficiary of state patronage. It also forced freethinkers to lie low in society and led to the suppression of those early Greek writings that supported atheism. Heretics were persecuted, and this practice became institutionalized with the various forms of the Inquisition by the church beginning around the 12th century. Recall that most ‘heretics’ were not atheists but religious people whose views differed from Catholic orthodoxy. The effect was to force specific religious beliefs on people and to require public affirmations of religious orthodoxy, a practice that has remained in force to this day, as we see with politicians routinely spouting pieties.

The arrival of the Renaissance around 1500 CE signaled a new time. The birth of the new sciences with Copernicus, Galileo, and Newton was coupled with the recovery of those early Greek skeptical writings, which Arab scholars had preserved. All this led to a flowering of new kinds of thinking. But those early days of modern science did not by themselves lead to a rise of disbelief or atheism. After all, those well-known scientists were all pious people, not skeptics. They simply felt it was inconceivable that science would reveal anything incompatible with god’s work in the world, so they did not seem to suffer any personal anxieties about where their research would lead. They felt that any seeming contradiction between scientific knowledge and the Bible had to be due to a misinterpretation of the Bible. In that sense they were far more sophisticated than present-day Biblical literalists, who lay the blame for the same conflicts at the feet of faulty science rather than religious texts.

When Galileo was asked by the church to explain the conflict between his views and the Bible, he said, quite reasonably, that the church had no choice but to agree with whatever knowledge science was producing. He said it would be “a terrible detriment for the souls if people found themselves convinced by proof of something that it was made a sin to believe.” (Almost Like a Whale, Steve Jones, 1999, p. 26) Of course, the Catholic Church did not heed his views, putting him under house arrest, and it is amazing that it was only as late as 1984 that they officially apologized for their treatment of him.

So even during the period called the ‘Enlightenment’ (roughly 1500-1800 CE), there continued to be a climate in which freethinking was discouraged, with severe penalties for blasphemy. The Inquisition was also gaining strength around this time, forcing freethinkers to suppress any public disavowal of god or even of Christian orthodoxy. In this climate, the re-emergence of skeptical beliefs necessarily had to be very cautious and incremental.

Next in this series: The beginnings of modern atheism.

POST SCRIPT: Question: What is a non sequitur?

Miss Teen USA 2007 finalist provides an illustration.

The journey to atheism

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some early ones. Today’s one is from August 8, 2005, edited and updated.)

In a comment to a previous post, Jim Eastman said something that struck me as very profound. He said:

It’s also interesting to note that most theists are also in the game of declaring nonexistence of deities, just not their own. This quote has been sitting in my quote file for some time, and it seems appropriate to unearth it.

“I contend we are both atheists – I just believe in one fewer god than you do. When you understand why you reject all other gods, you will understand why I reject yours as well.” – Stephen F. Roberts

The Roberts quote captures accurately an important stage in my own transition from belief to atheism. Since I grew up as a Christian in a multi-religious society and had Hindu, Muslim, and Buddhist friends, I had to confront the question of how to deal with other religions. My answer at that time was simple – Christianity was right and the others were wrong. Of course, since the Methodist Church I belonged to had an inclusive, open, and liberal theological outlook, I did not equate this distinction with good or evil or even heaven and hell. I felt that as long as people were good and decent, they were somehow all saved, irrespective of what they believed. But there was no question in my mind that Christians had the inside track on salvation and that others were at best slightly misguided.
[Read more…]

Waiting for the Rapture

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some early ones. Today’s one is from May 9, 2005, edited and updated.)

I am a huge fan of the English comic writer P. G. Wodehouse, especially of his Jeeves and Wooster books. The plots are pretty much the same in all the Jeeves stories but the smoothness of Wodehouse’s writing, his superb comic touch, and his brilliant choice of words make him a joy to read. Even though I have read all of the Jeeves books many times and know all the plots, they still have the ability to make me laugh out loud.
[Read more…]

What makes us change our minds?

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some early ones. Today’s one is from March 28, 2005, edited and updated.)

In an earlier post, I described the three kinds of challenges teachers face. Today I want to discuss how teachers might deal with each case.

On the surface, it might seem that the first kind of challenge (where students have little prior experience, explicit or implicit, with the material being taught and no strong feelings about it either way) is the easiest one. After all, if students have no strong beliefs or prior knowledge about what is being taught, they should be able to accept the new knowledge more easily.

That is true, but the ease of acceptance also has a downside. The very act of not caring means that the new knowledge goes in easily but is also liable to be forgotten easily once the course is over. In other words, it might have little lasting impact. Since the student has little prior knowledge in that area, there is little in the brain to anchor the new knowledge to. And if the student does not care about it one way or the other, he or she will make no real effort to connect with the material. So the student might learn the material mostly by memorizing it, reproduce it on the exams, and forget it a few weeks later.

The research on the brain indicates that lasting learning occurs when students tie new knowledge to things they already know, when they integrate it with existing material. So teachers of even highly technical topics need to find ways to connect their material with students’ prior knowledge. They have to know their students, what interests them, what concerns them, what they care about. This is why good teachers tie their material in some way to stories or topics that students know and care about, that may be in the news, or that are controversial. Such strategies tap into the existing knowledge structures in the brain (the neural networks) and connect the new material to them, so that it is more likely to ‘stick.’

The second kind of challenge is where students’ life experiences have resulted in strongly held beliefs about a particular knowledge structure, even though the student may not always be consciously aware of having such beliefs. A teacher who does not take these existing beliefs into account when designing teaching strategies is likely to be wasting her time. Because these beliefs are so strongly but unconsciously held, they are not easily dislodged or modified.

The task for the teacher in this case is to make students aware of their existing knowledge structures and of their implications for understanding situations. A teacher needs to create situations (say, experiments or cases) and encourage students to explore the consequences of their prior beliefs and see what happens when they are confronted by these new experiences. This has to be done repeatedly in newer and more enriched contexts so that students realize for themselves the existence and inadequacy of their prior knowledge structures and become more accepting of the new knowledge structures and theories.

In the third case, students consciously reject the new ideas because they are aware that these conflict with views they value more (for whatever reason). This is the situation with those religious people who reject evolutionary ideas because they conflict with their religious beliefs. In such cases, there is no point trying to force or browbeat them into accepting the new ideas.

Does this mean that such people’s ideas never change? Obviously not. People do change their views on matters that they may have once thought were rock-solid. In my own case, I know that I now believe things that are diametrically opposed to things that I once thought were true, and I am sure that my experience is very common.

But the interesting thing is that although I know that my views have changed, I cannot tell you when or why they changed. It is not as if there was an epiphany, a moment when you slap your forehead and exclaim, “How could I have been so stupid? Of course I was wrong and the new view is right!” Rather, the process seems more like being on an ocean liner that is turning around. The turn is so gentle that you are not aware it is even happening, but at some point you realize that you are facing in a different direction. There may be a moment of realization that you now believe something you did not before, but that moment is just an explicit acknowledgment of something that you had already tacitly accepted.

What started the process of change could be one of many factors – something you read, a news item, a discussion with a friend, some major public event – whose implications you may not be immediately aware of. But over time these little things lodge in your mind, and as your mind tries to integrate them into a coherent framework, your views start to shift. For me personally, I enjoy discussions of deep ideas with people I like and respect. Even if they do not have any expertise in this area, discussions with such people tend to clarify one’s ideas.

I can see that process happening to me right now with the ideas about the brain. I used to think that the brain was quite plastic, that any of us could be anything given the right environment. I am not so sure now. The work of Chomsky on linguistics, the research on how people learn, and other bits and pieces of knowledge I have read have persuaded me that it is not at all clear that the perfectly-plastic-brain idea can be sustained. It seems reasonable that some structures of the brain, especially the basic ones that enable it to interpret the input from the five senses, and perhaps even learn language, must be pre-existing.

But I am not completely convinced by the socio-biological views of people like E. O. Wilson and Steven Pinker, who seem to argue that much of our brains, attitudes, and values are biologically determined by evolutionary adaptation. Nor am I convinced by many of the popular claims about gender-related differences, such as that men are better than women at math or that women are more nurturing than men. That seems to me a little too pat. I am always a little skeptical of attempts to show that the status quo is ‘natural,’ since that claim has historically been used to justify inequality and oppression.

But the works of cognitive scientists are interesting and I can see my views on how the brain works changing slowly. One sign of this is my desire to read widely on the subject.

So I am currently in limbo as regards the nature of the brain, mulling things over. At some point I might arrive at some kind of unified and coherent belief structure. And after I do so, I may well wonder if I ever believed anything else. Such are the tricks the brain can play on you, to make you think that what you currently believe is what is correct and what you always believed.

POST SCRIPT: The Church of the Wholly Undecided

Les Barker has a funny poem about agnosticism.

The purpose of teaching

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some early ones. Today’s one is from March 24, 2005, edited and updated.)

I have been teaching for many years and encountered many wonderful students. I remember in particular two students who were in my modern physics courses that dealt with quantum mechanics, relativity, and cosmology.

Doug was an excellent student, demonstrating a wonderful understanding of all the topics we discussed in class. But across the top of his almost perfect final examination paper, I was amused to see that he had written, “I still don’t believe in relativity!”

The other student was Jamal, and he was not as direct as Doug. He came into my office a few years after the course was over (and just before he was about to graduate) to say goodbye. We chatted awhile, I wished him well, and then as he was about to leave he turned to me and said hesitantly in his characteristically shy way: “Do you remember that stuff you taught us about how the universe originated in the Big Bang about 15 billion years ago? Well, I don’t really believe all that.” After a pause he went on, “It kind of conflicts with my religious beliefs.” He looked apprehensively at me, perhaps to see if I might be offended or angry or think less of him. But I simply smiled and let it pass. It did not bother me at all.

Why was I not upset that these two students had, after having two semester-long courses with me, still not accepted the fundamental ideas that I had been teaching? The answer is simple. The goal of my teaching is not to change what my students believe. It is to have them understand what practitioners in the field believe. And those are two very different teaching goals.

As I said, I have taught for many years. And it seems to me that teachers encounter three kinds of situations with students.

One is where students do not have much prior experience (either explicitly or implicitly) with the material being taught and don’t have strong feelings about it either way. This is usually the case with technical or highly specialized areas (such as learning the symptoms of some rare disease or applying the laws of quantum mechanics to the hydrogen atom). In such cases, students have little trouble accepting what is taught.

The second type of situation is where students’ life experiences have resulted in strongly held beliefs about a particular knowledge structure, even though the student may not always be consciously aware of having such beliefs. The physics education literature is full of examples of how our life experiences conspire to create in people an Aristotelian understanding of mechanics. This makes it hard for them to accept Newtonian mechanics. Note that this difficulty exists even though the students have no particular attachment to Aristotle’s views on mechanics and may not have the faintest idea what they are. Overcoming this kind of implicit belief structure is not easy. Doug was an example of someone who had got over the first hurdle, from Aristotelian to Newtonian mechanics, but was finding the next transition, to Einsteinian relativistic ideas, much harder to swallow.

The third kind of situation is where the student has strong and explicit beliefs about something. These kinds of beliefs, as in the case of Jamal, come from religion or politics or parents or other major influences in their lives. You cannot force such students to change their views, and any instructor who tries to do so is foolish. If students think that you are trying to force them to a particular point of view, they are very good at telling you what they think you want to hear, while retaining their beliefs. In fact, trying to force or bully students to accept your point of view, apart from being a highly unethical teaching practice, is a sure way of reinforcing the strength of their original views.

So Doug’s and Jamal’s rejection of my ideas did not bother me and I was actually pleased that they felt comfortable telling me so. They had every right to believe whatever they wanted to believe. But what I had a right to expect was that they had understood what I was trying to teach and could use those ideas to make arguments within those frameworks.

For example, if I had given an exam problem whose solution required the student to demonstrate an understanding of relativistic physics, and Doug had refused to answer the question because he did not believe in relativity, or had answered it using his own private theories of physics, I would have had to mark him down.

Similarly, if I had asked Jamal to calculate the age of the universe using the cosmological theories we had discussed in class, and he had instead said that the universe was 6,000 years old because that is what the Bible said, then I would have had to mark him down too. He is free to believe what he wants, but the point of the course is to learn how the physics community interprets the world, and to be able to use that information.
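To give a flavor of the kind of calculation I would have expected from Jamal, here is my own back-of-the-envelope sketch (an illustration only, not anything from the actual course): the crudest estimate of the age of the universe is the Hubble time, 1/H0, which neglects the corrections a real cosmological model would include.

```python
# Rough "age of the universe" estimate via the Hubble time t = 1/H0.
# This is only an order-of-magnitude sketch; a full calculation would
# integrate the Friedmann equation for a specific cosmological model.

KM_PER_MPC = 3.0857e19      # kilometers in one megaparsec
SECONDS_PER_YEAR = 3.156e7  # seconds in one year

def hubble_time_gyr(h0_km_s_mpc):
    """Return 1/H0 in billions of years, for H0 given in km/s/Mpc."""
    t_seconds = KM_PER_MPC / h0_km_s_mpc
    return t_seconds / SECONDS_PER_YEAR / 1e9

print(round(hubble_time_gyr(70), 1))  # prints 14.0, i.e. about 14 billion years
```

A student who wrote down an estimate along these lines would be using the framework taught in class, whatever his private beliefs about it, and that is all I was asking for.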

Understanding this distinction is important because of the periodic appearance of demagogues who try to frighten people by asserting that colleges are indoctrinating students to think in a particular way. Such people seem to assume that students are like sheep who can be induced to believe almost anything the instructor wants them to and thus require legal protection. Anyone who has taught for any length of time and has listened closely to students will know that this is ridiculous. It is not that students are not influenced by teaching and do not change their minds but that the process is far more complex and subtle than it is usually portrayed, as I will discuss in the next posting.

My own advice to students is to listen carefully and courteously to what knowledgeable people have to say, learn what the community of scholars thinks about an issue, and be able to understand and use that information when necessary. Weigh the arguments for and against any issue but ultimately stand up for what you believe and even more importantly know why you believe it. Don’t ever feel forced to accept something just because some ‘expert’ or other authority figure (teacher, preacher, parent, political leader, pundit, or media talking head) tells you it is true. Believe things only when it makes sense to you and you are good and ready for it.

“I know this is politically incorrect but . . .”

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some early ones. Today’s one is from August 14, 2006, edited and updated.)

One of the advantages of being older is that sometimes you can personally witness how language evolves and changes, and how words and phrases undergo changes and sometimes outright reversals of meaning.

One of the interesting evolutions is that of the phrase “politically correct.” It was originally used as a kind of scornful in-joke within Marxist political groups to sneer at those members who seemed to have an excessive concern with political orthodoxy and who seemed to be more preoccupied with vocabulary than with the substance of arguments and actions.

Later it became used as a weapon against those who were trying to make language more nuanced and inclusive and less hurtful, judgmental, or discriminatory. Such people advocated using “disabled” instead of “crippled” or “mentally ill” instead of “crazy,” or “hearing impaired” instead of “deaf” and so on in an effort to remove the stigma under which those groups had traditionally suffered. Those who felt such efforts had been carried to an extreme, or just wanted to use words the way they always had, disparaged those efforts as trying to be “politically correct.”

The most recent development has been to shift the emphasis from sneering at the careful choosing of words to sneering at the ideas and sentiments behind those words. The phrase has started being used pre-emptively, to shield people from the negative repercussions of stating views that otherwise may be offensive or antiquated. This usage usually begins with “I know this is politically incorrect but . . .” and then finishes up by making a statement that would normally provoke quick opposition.

So you can now find people saying “I know this is politically incorrect but perhaps women are inferior to men at mathematics and science” or “I know this is politically incorrect but perhaps poor people are poor because they are stupid” or “I know this is politically incorrect but perhaps blacks are less capable than whites at academics.” The opening preamble is not only designed to make such statements acceptable, the speaker can even claim the mantle of being daring and brave, an outspoken and even heroic bearer of unpopular or unpalatable truths.

Take for example, a blurb by intelligent design creationist Jonathan Wells for his own book The Politically Incorrect Guide to Darwinism and Intelligent Design. The cover of the book says: “Darwin is an emperor who has no clothes— but it takes a brave man to say so. Jonathan Wells, a microbiologist with two Ph.D.s (from Berkeley and Yale), is that brave man.” There have been similar books that try this same linguistic maneuver, such as The Politically Incorrect Guide to Science and The Politically Incorrect Guide to Global Warming (and Environmentalism).

Brandishing the label of being ‘politically incorrect’ as a form of argument is silly, as is invoking the fact that one has a doctorate. It is actually a sign of weakness, indicating that one’s arguments cannot stand on their own. For example, physicists assume that all electrons are identical. We don’t really know this for a fact, since it is impossible to compare all electrons. The statement “all electrons are identical” is a kind of default position and, in the absence of evidence to the contrary, does not need to be supported by positive evidence. The assertion that “some electrons are heavier than others” is going to be dismissed in the absence of supporting evidence. Simply saying “I know this is not politically correct but I believe some electrons are heavier than others and I have a PhD” does not make it any more credible. It merely makes you look pompous and self-aggrandizing.

Sentiments that would normally be considered discriminatory, biased, and outright offensive if uttered without any supporting evidence are now protected from criticism by this preamble. It is then the person who challenges this view who is put on the defensive, as if he or she were some prig who unthinkingly spouts an orthodox view.

Fintan O’Toole of The Irish Times (May 5, 1994) noted this trend early and pithily said:

We have now reached the point where every goon with a grievance, every bitter bigot, merely has to place the prefix, “I know this is not politically correct but . . .” in front of the usual string of insults in order to be not just safe from criticism but actually a card, a lad, even a hero. Conversely, to talk about poverty and inequality, to draw attention to the reality that discrimination and injustice are still facts of life, is to commit the new sin of political correctness. . . . Anti-PC has become the latest cover for creeps. It is a godsend for every sort of curmudgeon or crank, from the fascistic to the merely smug.

Hate blacks? Attack positive discrimination – everyone will know the codes. Want to keep Europe white? Attack multiculturalism. Fed up with the girlies making noise? Tired of listening to whining about unemployment when your personal economy is booming? Haul out political correctness and you don’t even have to say what’s on your mind.

Even marketers are cashing in on this anti-PC fad, as illustrated by this cartoon.

Here’s a tip. Anyone who feels the need to invoke the “politically incorrect” trope as an indicator of his or her valor is probably trying to hide the weaknesses in his or her argument.

POST SCRIPT: Comparing the candidates

How do the presidential candidates compare when it comes to where they stand on the left-right and authoritarian-libertarian continua?

You can see for yourself, based on their positions on a range of issues.

I found it interesting (but not surprising) that every candidate of both parties (except for Democrats Dennis Kucinich and Mike Gravel) ended up in the right-wing/authoritarian quadrant.

You can also answer the questions yourself and compare yourself to them. My scores put me in the deep southwest part of the left-libertarian quadrant, more so than Kucinich and Gravel.

These kinds of things are fun but should not be considered a serious analysis of political philosophies.

What do creationist/ID advocates want-III?

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some very early ones, updated if necessary. Today’s one is from March 18, 2005, edited and updated.)

It is time to tackle head-on the notion of what is meant by the ‘materialism’ that the intelligent design creationism (IDC) camp find so distasteful. (See part I and part II for the background.)

The word materialism is used synonymously with ‘naturalism’ and perhaps the clearest formulation of what it means can be found in the writings of paleontologist George Gaylord Simpson, who said in Tempo and Mode in Evolution (p. 76):

The progress of knowledge rigidly requires that no non-physical postulate ever be admitted in connection with the study of physical phenomena. We do not know what is and what is not explicable in physical terms, and the researcher who is seeking explanations must seek physical explanations only. (Emphasis added)

Simpson was not an atheist (as far as I can tell) but he is saying something that all scientists take for granted: when you seek a scientific explanation for something, you look for something that has natural causes, and you do not countenance the miraculous or the inscrutable. This process is more properly called ‘methodological naturalism’, to be contrasted with ‘philosophical naturalism.’

Despite the polysyllabic terminology, the ideas are easy to understand. For example, if you hear a strange noise in the next room, you might wonder if it is a radiator or the wind or a mouse or an intruder. You can systematically investigate each possible cause, looking for evidence. For each question that you pose, the answer is sought in natural causes. You would be unlikely to say: “The noise in the next room is caused by god throwing stuff around.” In general, people don’t invoke god to explain the everyday phenomena of our lives, even though they might be quite religious.

Methodological naturalism is just that same idea. Scientists look for natural explanations to the phenomena they encounter because that is the way science works. Such an approach allows you to systematically investigate open questions and not shut off avenues of research. Any scientist who said that an experimental result was due to God intervening in the lab would be looked at askance, not because other scientists are all atheists determined to stamp out any form of religion but because that scientist would be violating one of the fundamental rules of operation. There is no question in science that is closed to further investigation of deeper natural causes.

Non-scientists sometimes do not understand how hard and frustrating much of scientific research is. People work for years and even decades banging their heads against brick walls, trying to solve some tough problem. What keeps them going? What makes them persevere? It is the practice of methodological naturalism, the belief that a discoverable explanation must exist and that it is only their ingenuity and skill that is preventing them from finding the solution. Unsolved problems are seen as challenges to the skills of the individual scientist and the scientific community, not as manifestations of god’s workings.

This is what, for example, causes medical researchers to work for years to find causes (and thus possibly cures) for rare and obscure diseases. Part of the reason is the desire to be helpful, part of it is due to personal ambition and career advancement, but an important part is also the belief that a solution exists that lies within their grasp.

It is because of this willingness to persevere in the face of enormous difficulty that science has been able to make the breakthroughs it has. If, at the early signs of difficulty in solving a problem, scientists threw up their hands and said, “Well, looks like god is behind this one. Let’s give up and move on to something else,” then the great discoveries of science that we associate with Newton, Darwin, Einstein, Planck, Heisenberg, etc. would never have occurred.

For example, the motion of the perigee of the moon was a well-known unsolved problem for over sixty years after the introduction of Newtonian physics. It constituted a serious problem that resisted solution for a longer time than the problems in evolution pointed to by IDC advocates. Yet no supernatural explanation was invoked, eventually the problem was solved, and the result was seen as a triumph for Newtonian theory.

So when IDC advocates advocate the abandonment of methodological naturalism, they are not trying to ease just Darwin out of the picture. They are throwing out the operational basis of the entire scientific enterprise.

Philosophical (or ontological) naturalism, as contrasted with methodological naturalism, is the belief that the natural world is all there is, that there is nothing more. Some scientists undoubtedly choose to be philosophical naturalists (and thus atheists) because they see no need to have god in their philosophical framework, but as I said in an earlier posting, others reject that option and stay religious. But this is purely a personal choice made by individual scientists and it has no impact on how they do science, which only involves using methodological naturalism. There is no requirement in science that one must be a philosophical naturalist, and as I alluded to earlier, there is little evidence that Gaylord Simpson was a philosophical naturalist although he definitely was a methodological naturalist.

The question of philosophical naturalism is, frankly, irrelevant to working scientists. Scientists don’t really care if their colleagues are religious or not. I have been around scientists all my life. But apart from my close friends, I have no idea what their religious beliefs are, and even then I have only a vague idea of what they actually believe. I know that some are religious and others are not. Whether a scientist is a philosophical naturalist or not does not affect how his or her work is received by the community. It just does not matter.

But what the IDC advocates want, according to their stated goal of “If things are to improve, materialism needs to be defeated and God has to be accepted as the creator of nature and human beings” is to enforce the requirement that scientists reject both philosophical and methodological naturalism. They are essentially forcing two things on everyone:

  • Requiring people to adopt the IDC religious worldview as their own.
  • Requiring scientists to reject methodological naturalism as a rule of operation for science.

In other words, IDC advocates are not asking us to reject only Darwin or to turn the scientific clock back to the time just prior to Darwin; they want us to go all the way back to before Copernicus and reject the very methods of science that have enabled it to be so successful. They want us to go back to a time of rampant and unchecked superstition.

This is not a good idea.