The history of western atheism-3: The first published atheist

(For previous posts in this series, see here.)

In his BBC4 TV series Atheism: A Rough History of Disbelief, Jonathan Miller awards the honor of being the first published atheist to France’s Paul-Henri Thiry, Baron D’Holbach (1723-1789). As the Encyclopaedia Britannica entry on him says:

His most popular book, Système de la nature (1770) (“The System of Nature”), published under the name of J.B. Mirabaud, caustically derided religion and espoused an atheistic, deterministic Materialism: causality became simply relationships of motion, man became a machine devoid of free will, and religion was excoriated as harmful and untrue. In Le Christianisme dévoilé (1761; “Christianity Unveiled”), published under the name of a deceased friend, N.A. Boulanger, he attacked Christianity as contrary to reason and nature.

It is said that the Baron’s salon was a congenial meeting place for all manner of freethinkers, including Benjamin Franklin during his stay in France, but some of his guests were so alarmed by the inflammatory nature of the speculations aired there that they stopped coming. Even a nobleman like D’Holbach had to be cautious about his views, since atheism was grounds for persecution and even execution, so his works on these subjects were published pseudonymously.

When one reads the Baron’s views, his caution is understandable. Here is a sample of his writings, which are bracingly direct and modern:

  • If we go back to the beginning we shall find that ignorance and fear created the gods; that fancy, enthusiasm, or deceit adorned or disfigured them; that weakness worships them; that credulity preserves them, and that custom, respect and tyranny support them in order to make the blindness of men serve its own interests.
  • If the ignorance of nature gave birth to gods, the knowledge of nature is calculated to destroy them.
  • All religions are ancient monuments to superstitions, ignorance, ferocity; and modern religions are only ancient follies rejuvenated.
  • All children are atheists — they have no idea of God.
  • What has been said of [God] is either unintelligible or perfectly contradictory; and for this reason must appear impossible to every man of common sense.
  • The Jehovah of the Jews is a suspicious tyrant, who breathes nothing but blood, murder, and carnage, and who demands that they should nourish him with the vapours of animals. The Jupiter of the Pagans is a lascivious monster. The Moloch of the Phoenicians is a cannibal. The pure mind of the Christians resolved, in order to appease his fury, to crucify his own son. The savage god of the Mexicans cannot be satisfied without thousands of mortals which are immolated to his sanguinary appetite.
  • Many men without morals have attacked religion because it was contrary to their inclinations. Many wise men have despised it because it seemed to them ridiculous. Many persons have regarded it with indifference, because they have never felt its true disadvantages. But it is as a citizen that I attack it, because it seems to me harmful to the happiness of the state, hostile to the march of the mind of man, and contrary to sound morality, from which the interests of state policy can never be separated.
  • Tolerance and freedom of thought are the veritable antidotes to religious fanaticism.
  • Religion has ever filled the mind of man with darkness, and kept him in ignorance of his real duties and true interest. It is only by dispelling the clouds and phantoms of Religion, that we shall discover Truth, Reason, and Morality. Religion diverts us from the causes of evils, and from the remedies which nature prescribes; far from curing, it only aggravates, multiplies, and perpetuates them.

Pretty strong stuff, especially for the 18th century, and one can understand why the good Baron was wary of saying these things under his own name. But there is nothing in the above list that any modern atheist would disagree with.

Baron D’Holbach’s writings are said to have been extremely influential, perhaps because they said so directly what had long been thought secretly in the minds of many thoughtful people. It is very likely that his works were well known to Erasmus Darwin (1731-1802), Charles Darwin’s grandfather, who was himself a radical freethinker and who had published his own Lamarckian theory of evolution in Zoonomia around 1795.

Although Charles Darwin started out as a religious person and was contemplating becoming an Anglican clergyman early on, there is little doubt that the disbelief of his father, grandfather, and brother was a factor in his later move away from religion. He knew them to be good and decent people, and the thought that they would be punished and suffer torments simply because of their disbelief was impossible for him to accept. As he wrote in his autobiography (The Reluctant Mr. Darwin, David Quammen, p. 246):

I can hardly see how anyone ought to wish Christianity to be true: for if so the plain language of the text seems to show that men who do not believe, and this would include my Father, Brother, and almost all my best friends, will be everlastingly punished.

And this is a damnable doctrine.

The orator Robert Ingersoll (1833-1899) said that “The notion that faith in Christ is to be rewarded by an eternity of bliss, while a dependence upon reason, observation, and experience merits everlasting pain, is too absurd for refutation, and can be relieved only by that unhappy mixture of insanity and ignorance, called ‘faith.’” Darwin would probably have sympathized with the statement although, being someone who avoided social controversy, he probably would not have stated it so strongly.

It is interesting to see the interweaving threads of religion, science, and atheism in those times. Was it the atheist writings of people like D’Holbach that opened up the creative window for Charles Darwin, Charles Lyell, and other scientists, freeing them from the constraints of having their science strictly conform to religious dogma? It is hard to say. But the more liberal climate definitely would have helped.

Next in this series: Atheism shifts from the intellectuals to the masses.

Charles Darwin in his own words

I have written a lot about the theory of evolution and in the process have quoted short excerpts from various authors, Charles Darwin included. (Please see here for previous posts in this series.)

But in going back and reading the first edition of On the Origin of Species (1859), I am struck by how prescient Darwin was in anticipating the objections that would be raised against his theory, and why. He could well have been talking about the situation today, except that then the skeptics he was trying to persuade were his scientific colleagues. Nowadays scientists are almost all converts to natural selection (as he predicted might happen), and it is religious lay people who make the same objections he addressed long ago.

To get the full flavor of Darwin’s thinking and his style of writing, here is a somewhat long passage from his conclusions, where he summarizes his case (pp. 480-484). The sections in boldface are my own emphasis. (Darwin’s complete works are now available online.)

I have now recapitulated the chief facts and considerations which have thoroughly convinced me that species have changed, and are still slowly changing by the preservation and accumulation of successive slight favourable variations. Why, it may be asked, have all the most eminent living naturalists and geologists rejected this view of the mutability of species? It cannot be asserted that organic beings in a state of nature are subject to no variation; it cannot be proved that the amount of variation in the course of long ages is a limited quantity; no clear distinction has been, or can be, drawn between species and well-marked varieties. It cannot be maintained that species when intercrossed are invariably sterile, and varieties invariably fertile; or that sterility is a special endowment and sign of creation. The belief that species were immutable productions was almost unavoidable as long as the history of the world was thought to be of short duration; and now that we have acquired some idea of the lapse of time, we are too apt to assume, without proof, that the geological record is so perfect that it would have afforded us plain evidence of the mutation of species, if they had undergone mutation.

But the chief cause of our natural unwillingness to admit that one species has given birth to other and distinct species, is that we are always slow in admitting any great change of which we do not see the intermediate steps. The difficulty is the same as that felt by so many geologists, when Lyell first insisted that long lines of inland cliffs had been formed, and great valleys excavated, by the slow action of the coast-waves. The mind cannot possibly grasp the full meaning of the term of a hundred million years; it cannot add up and perceive the full effects of many slight variations, accumulated during an almost infinite number of generations.

Although I am fully convinced of the truth of the views given in this volume under the form of an abstract, I by no means expect to convince experienced naturalists whose minds are stocked with a multitude of facts all viewed, during a long course of years, from a point of view directly opposite to mine. It is so easy to hide our ignorance under such expressions as the “plan of creation,” “unity of design,” &c., and to think that we give an explanation when we only restate a fact. Any one whose disposition leads him to attach more weight to unexplained difficulties than to the explanation of a certain number of facts will certainly reject my theory. A few naturalists, endowed with much flexibility of mind, and who have already begun to doubt on the immutability of species, may be influenced by this volume; but I look with confidence to the future, to young and rising naturalists, who will be able to view both sides of the question with impartiality. Whoever is led to believe that species are mutable will do good service by conscientiously expressing his conviction; for only thus can the load of prejudice by which this subject is overwhelmed be removed.

Several eminent naturalists have of late published their belief that a multitude of reputed species in each genus are not real species; but that other species are real, that is, have been independently created. This seems to me a strange conclusion to arrive at. They admit that a multitude of forms, which till lately they themselves thought were special creations, and which are still thus looked at by the majority of naturalists, and which consequently have every external characteristic feature of true species,—they admit that these have been produced by variation, but they refuse to extend the same view to other and very slightly different forms. Nevertheless they do not pretend that they can define, or even conjecture, which are the created forms of life, and which are those produced by secondary laws. They admit variation as a vera causa in one case, they arbitrarily reject it in another, without assigning any distinction in the two cases. The day will come when this will be given as a curious illustration of the blindness of preconceived opinion. These authors seem no more startled at a miraculous act of creation than at an ordinary birth. But do they really believe that at innumerable periods in the earth’s history certain elemental atoms have been commanded suddenly to flash into living tissues? Do they believe that at each supposed act of creation one individual or many were produced? Were all the infinitely numerous kinds of animals and plants created as eggs or seed, or as full grown? and in the case of mammals, were they created bearing the false marks of nourishment from the mother’s womb? Although naturalists very properly demand a full explanation of every difficulty from those who believe in the mutability of species, on their own side they ignore the whole subject of the first appearance of species in what they consider reverent silence.

It may be asked how far I extend the doctrine of the modification of species. The question is difficult to answer, because the more distinct the forms are which we may consider, by so much the arguments fall away in force. But some arguments of the greatest weight extend very far. . . I believe that animals have descended from at most only four or five progenitors, and plants from an equal or lesser number.

Analogy would lead me one step further, namely, to the belief that all animals and plants have descended from some one prototype. But analogy may be a deceitful guide. Nevertheless all living things have much in common, in their chemical composition, their germinal vesicles, their cellular structure, and their laws of growth and reproduction. We see this even in so trifling a circumstance as that the same poison often similarly affects plants and animals; or that the poison secreted by the gall-fly produces monstrous growths on the wild rose or oak-tree. Therefore I should infer from analogy that probably all the organic beings which have ever lived on this earth have descended from some one primordial form, into which life was first breathed.

And then the very last, almost poetic, words in the book (p. 490):

There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.

Darwin’s achievements are truly magnificent, putting him in the same class as Einstein and Newton, among the greatest scientists of all time.

POST SCRIPT: The Larry Craig incident

Senator Larry Craig (R-Idaho) has taken some strong “family values” and anti-gay stands in the past, despite long standing rumors that he himself was gay. The recent news report that he had pleaded guilty to “lewd” conduct in a public restroom has caused speculation that his career is now over.

It is despicable to harass gays with anti-gay rhetoric and legislation, and it is even worse if those doing so are secretly gay themselves. But Talking Points Memo expresses well my unease with what happened to Craig in this most recent episode. It is not clear from published reports that he did anything that really warranted his arrest; as Josh Marshall says, he was essentially caught in a Catch-22 created by his own risky behavior.

Glenn Greenwald documents the brazen contradictions that right-wingers are indulging in the way they respond to the recent Craig revelation, the reports that surfaced back in 2006 that he was gay, and the recent case of Senator David Vitter (R-Louisiana), another “family values” champion who was found to be a customer of prostitutes.

The history of western atheism-2: The beginnings of modern atheism

(For previous posts in this series, see here.)

The philosopher René Descartes (1596-1650) may have unwittingly been the trigger for the revival of freethinking during the Enlightenment. Although he always asserted his own fidelity to the teachings of the church, the clarity of his thinking about the mind-body relationship exposed some of the fundamental problems and contradictions that inevitably accompany religious beliefs.

Belief in god has always required a kind of dualistic ‘two different worlds and two different kinds of matter’ way of thinking, but usually left unexamined the thorny questions of how the two interacted. Descartes’ exposition on this duality and his attempts to find a way by which the world and matter of god interacted with the world and matter of people exposed the difficulties with dualism, problems which plague thoughtful believers to this day as they try to reconcile a scientific perspective with religious faith.

Jonathan Miller in Atheism: A Rough History of Disbelief suggests that the first modern philosopher to seriously challenge the basis of the existing religious orthodoxies was Thomas Hobbes (1588-1679). He advocated ‘monism’, the idea that only one kind of stuff exists, and that stuff is what we see as matter. This ruled out dualism and, with it, non-material entities like the soul and god. Although Hobbes’s book Leviathan (1651) advocated a strict materialism of both human nature and knowledge, he was not really an atheist and might better be classified as one of the first modern deists, someone who allows for the existence of some prime mover who set the universe in motion but does not interfere subsequently.

The official climate in Hobbes’s time still strongly discouraged any form of skepticism, and people had to be cautious about going against these norms of belief. Perhaps as a result of the alarm caused to the supporters of religion by the spread of the kind of views expressed by Hobbes, in 1694 the British parliament held a long debate and passed a bill prescribing the death penalty for blasphemy for anyone who denied divinity. Early drafts of the bill even included atheism as grounds for execution, although that provision did not appear in the final law. But it gives us a sense of the degree of public opprobrium one risked by espousing any form of heterodoxy.

One can see the strong appeal of deism for freethinkers in those times. Deism allowed people to formally genuflect to god and maintain a stance of official belief while giving free rein to their intellect in all other matters, especially science, since in the deist framework god was invoked only to explain the original creation of the universe and its laws, and maintained a strict hands-off policy thereafter. Since atheism could be grounds for persecution and even execution, it seems reasonable to suppose that many deists of those days were closeted atheists.

The fact that many of the prominent leaders of the American revolution (such as George Washington, Thomas Jefferson, Benjamin Franklin, Ethan Allen, James Madison, and James Monroe) were deists and had no trouble advocating the constitutional separation of church and state makes sense in this historical context. They were rebelling against the restrictive entanglements of religion with government back in England, while trying not to get too far ahead of their own populace in matters of religion. After all, there have always been influential religious zealots in America, some of whom even went to the extent of seeking out and executing witches, and it would not have been politically expedient to disavow god altogether. Still, it is quite amazing how sophisticated the American political leadership of that time was in such matters, compared to the present day, when leaders publicly express a bizarre belief that god is in personal contact with them, and some do not even accept the theory of evolution.

While Hobbes, with his theory of monism, laid the philosophical basis for modern atheism, Miller argues that he cannot truly be identified as the first atheist. Neither can the philosopher David Hume (1711-1776), who followed in Hobbes’s footsteps. But both were definitely anti-religious and flirted publicly with atheism, and it would not be surprising if they were privately atheists, since both dropped hints that they suspected most people were a lot less pious than they publicly let on.

David Hume, writing in chapter XII of The Natural History of Religion (1757), suspected that there was a great deal of hypocritical piety among his contemporaries:

We may observe, that, notwithstanding the dogmatical, imperious style of all superstition, the conviction of the religionists, in all ages, is more affected than real, and scarcely ever approaches, in any degree, to that solid belief and persuasion, which governs us in the common affairs of life. Men dare not avow, even to their own hearts, the doubts which they entertain on such subjects: They make a merit of implicit faith; and disguise to themselves their real infidelity, by the strongest asseverations and most positive bigotry. But nature is too hard for all their endeavours, and suffers not the obscure, glimmering light, afforded in those shadowy regions, to equal the strong impressions, made by common sense and by experience. The usual course of men’s conduct belies their words, and shows, that their assent in these matters is some unaccountable operation of the mind between disbelief and conviction, but approaching much nearer to the former than to the latter.

One gets the impression that while the people of Hume’s time may not have publicly expressed disbelief, there were a lot of knowing winks and nudges exchanged when public piety was encountered.

I think that Hume is describing many people today as well.

Next in this series: The first published atheist.

POST SCRIPT: After Fredo

The Department of Justice, like the IRS, can function effectively only if it is perceived as being above partisan politics. This is because, unlike most other government agencies, these bodies can wield great power over individuals, and so any action they take has to be seen as not serving a partisan agenda.

Alberto Gonzales instigated and presided over the almost complete politicization of the Justice Department, making it serve as an extension of the White House. His welcome departure is being accompanied by calls that he be replaced by someone who will restore some semblance of independence and integrity to that institution.

I am not sanguine that this will happen and am not sure why people have such high hopes. The Bush administration has a consistent track record of appointing to every position as partisan a political hack as it can get away with. Right now, the only constraint on its excesses is that the Democrats have to approve the nominee, but I fully expect the nominee to be someone they think they can just squeak past the approval process.

This is one of those predictions where I hope I am wrong.

Reflections on the working poor

(Text of the talk given by me to the first year class at the Share the Vision program, Severance Hall, Cleveland, OH on Friday, August 24, 2007 at 1:00 pm. The common reading for the incoming class was David Shipler’s book The Working Poor: Invisible in America.)

Welcome to Case Western Reserve University! The people you will encounter here are very different from the people described in David Shipler’s book The Working Poor: Invisible in America and I would like to address the question: what makes that difference?

Two answers are usually given. One is that we live in a meritocracy, and that we got where we are because of our own virtues: we were smarter, worked harder, or had a better attitude and work ethic than those who didn’t make the cut. I am sure that everyone in this auditorium has been repeatedly told by their family and friends and teachers that they are good and smart, and it is tempting to believe it. What can be more gratifying than to be told that one’s success is due to one’s own ability and efforts? It makes it all seem so well deserved, that there is justice in the world.

Another answer is that luck plays an important role in educational success. I suspect that most of us were fortunate enough to be born into families that had most, if not all, of the following attributes: stable homes and families, good schools and teachers, safe environments, good health, and sufficient food and clothing. Others are not so fortunate and this negatively affects their performance in school.

But there is a third possibility that is not often discussed and that is that the educational system has been deliberately designed so that large numbers of people end up like the people in the book, people who not only have failed but more importantly have learned to think of themselves as failures.

This idea initially seems shocking. How can we want people to fail? Aren’t our leaders always exhorting everyone to aim high and succeed in education? But let’s travel back in time to the beginnings of widespread schooling in the US. In those early days, schooling was unplanned and focused more on meeting the needs of the learner and less on meeting the needs of the economy.

Recall that this was the time when the so-called robber barons were amassing huge personal wealth while workers endured appalling working conditions. There was increasing concern that, as the general public got more educated, more and more people would realize and resent this unequal distribution of wealth.

This fear can be seen in an 1872 Bureau of Education document which speaks about the “problem of educational schooling”, according to which, “inculcating knowledge” teaches workers to be able to “perceive and calculate their grievances,” thus making them “more redoubtable foes” in labor struggles. (John Taylor Gatto, The Underground History of American Education (2003), p. 153, now available online.)

This was followed by the report in 1888 that said, “We believe that education is one of the principal causes of discontent of late years manifesting itself among the laboring classes.” (Gatto, p. 153)

The rising expectations of the general public had to be dampened and this was done by creating an education system that would shift the focus away from learning and more towards meeting the needs of the economy. And the economy then, like now, does not need or want everyone to be well educated.

After all, think what would happen if everyone got a good education and a college degree. Where would we get enough people like those in the book, willing to work for low wages, often with little or no benefits, at places like Wal-Mart so that we can buy cheap goods? Or at McDonald’s so that we can get cheap hamburgers? Or as cleaning staff at restaurants and hotels so that we can eat out often? Or in the fields and sweatshops so that we can get cheap food and clothes? As the French philosopher Voltaire pointed out long ago: “The comfort of the rich depends upon the abundance of the poor.”

One of the most influential figures in shifting education to meet the needs of the work force was Ellwood P. Cubberley, who wrote in 1905 that schools were to be factories “in which raw products, children, are to be shaped and formed into finished products… manufactured like nails, and the specifications for manufacturing will come from government and industry.” (Gatto, footnote on page 39 in the online edition of the book.)

He also wrote: “We should give up the exceedingly democratic idea that all are equal and that our society is devoid of classes.”

The natural conclusion of this line of reasoning was spelled out in a speech that Woodrow Wilson gave in 1909, three years before he was elected President of the United States. He said: “[W]e want to do two things in modern society. We want one class to have a liberal education. We want another class, a very much larger class of necessity, to forgo the privilege of a liberal education and fit themselves to perform specific difficult manual tasks.” (The Papers of Woodrow Wilson, vol. 18, 1908-1909, Princeton University Press, Princeton NJ, 1974, p. 597.)

So a third possible answer to why all of us are different from the people described in Shipler’s book is that the educational system is designed to make sure that only a small percentage (us) will succeed and a much larger percentage (like the people in the book) will fail.

But it is not enough to simply exclude people from success, as they will resent it and rebel. After all, all people have dreams of a good life. As Shipler writes on page 231: “Virtually all the youngsters I spoke with in poverty-ridden middle schools wanted to go on to college. . .Their ambitions spilled over the brims of their young lives.” They dreamed of becoming doctors, lawyers, nurses, archeologists, and policemen. Those dreams have to be crushed to meet the needs of the economy. But crushing people’s dreams carries risks.

The poet Langston Hughes warned what might happen in his poem A Dream Deferred:

What happens to a dream deferred?

Does it dry up
like a raisin in the sun?
Or fester like a sore–
And then run?
Does it stink like rotten meat?
Or crust and sugar over–
like a syrupy sweet?

Maybe it just sags
like a heavy load.

Or does it explode?

In order to prevent people with crushed dreams from exploding, you have to make them resigned to their fate, to think it is their own fault, to consider themselves failures and unworthy. How do you do that? By making them repeatedly experience failure and discouragement, so that by the time they reach high school or even middle school, their love of learning has been destroyed. They have been beaten down, their hopes and dreams crushed by being told repeatedly that they are lazy and no good, so that they do not aim high and instead think of themselves as so worthless and invisible that it does not even matter whether they show up for work or not.

And we have done that. Currently we have an educational system in which people do primarily blame themselves for failure. As Shipler writes in his preface: “Rarely are they infuriated by their conditions, and when their anger surfaces, it is often misdirected against their spouses, their children, or their co-workers. They do not usually blame their bosses, their government, their country, or the hierarchy of wealth, as they reasonably could. They often blame themselves, and they are sometimes right.”

So does this mean that everything that our proud parents and teachers have told us about how smart we are is false? No, that is still true. What is false is the widespread belief that all the other people are poor because they are intrinsically stupid or lazy or incompetent.

You are now in a place that values knowledge and inquiry and has the resources to satisfy your curiosity about almost anything. And all this knowledge is freely shared with you, limited only by your own desire to learn. But all the knowledge you gain should not be used to distance yourself even further from those who have not been as fortunate as you, or to think of yourself as superior to them.

All this knowledge is given to you so that you can become a better steward of the planet, so that you will try and create the kind of world where more people, in fact all people, can live the same kind of life that you will lead.

POST SCRIPT: Bye, Bye, Fredo

Alberto Gonzales surely must rank as a front-runner for the worst Attorney General ever, despite strong competition from people like President Nixon’s John Mitchell. In fact, the administration of George W. Bush has strong candidates for the worst ever nods in all the major categories: President, Vice President, Secretary of Defense, Secretary of State, and National Security Advisor.

Truly this is an administration that can only be described in superlatives.

The history of western atheism-1: The ancient origins

In the BBC4 TV program Atheism: A Rough History of Disbelief, host Jonathan Miller states flatly right at the beginning, “This series is about the disappearance of something – religious faith. . . The history of the growing conviction that god does not exist.”

(The full three-hour, three-part series can be seen from the beginning here. The price you pay for its being on YouTube is that each hour is chopped up into six ten-minute segments to meet the time restrictions. But the video and sound quality are excellent.)

Miller did a nice job of summarizing the rise and fall and rise again of freethinking. Strictly speaking, his is a survey of atheism just in the western world. In the eastern world of two millennia ago, the widespread acceptance of Confucianism, which placed very little emphasis on a god, and Buddhism, which required no belief in god, suggests that atheism was not perceived as negatively as in the west.

The Miller documentary is structured quite traditionally. It is long on voice-over narration by Miller as he walks through various imposing historical churches, museums, and other buildings and gazes upwards at portraits and statues of the people he is talking about, interspersed with interviews with scholars. It is Miller talking to the viewer in an informal, chatty way, interweaving the history of disbelief with his own journey to a comfortable atheism. But what it lacks in drama and glitz, it more than makes up for in the low-key, understated charm that is characteristic of good BBC documentaries. The second and third hours are especially good as the pace picks up.

Miller points out that many of the early Greek philosophers were freethinkers, highly skeptical of the idea of a god. It is interesting that in those very early days, the Greeks had a much more sophisticated view of god and religion than we have even now, and the program provides many wonderful quotes about religion and god as evidence.

Epicurus (341-271 BCE) posed the essential and, to my mind, the ultimate contradiction that believers in god face: How to explain the existence of evil.

Is god willing to prevent evil but not able? Then he is not omnipotent.
Is he able but not willing? Then he is malevolent.
Is god both able and willing? Then whence cometh evil?
Is he neither able nor willing? Then why call him god?

Religious people usually avoid these questions by invoking ignorance, the ‘mysterious ways’ clause, which says that god has reasons for allowing evil to occur that we are unable to comprehend, although it is not clear how they know that god does not want them to understand. But as the French philosopher Voltaire once said, “The truths of religion are never so well understood as by those who have lost the power of reasoning.”

Lucretius (circa 99-55 BCE) proposed a theory of the origins of religion and articulated an early formulation of naturalism: “Fear is the mother of all gods. Nature does all things spontaneously by herself without their meddling.”

Cicero (106-43 BCE) pointed out that it is obvious that there is no god and that much public piety is hypocritical and based on fear. “In this subject of the nature of the gods, the first question is do the gods exist or do they not? It is difficult, you will say, to deny that they exist. I would agree, if we were arguing the matter in a public assembly. But in a private discussion of this kind, it is perfectly easy to do so.”

Seneca (circa 4 BCE-65 CE) argued that belief in god is a fraud perpetrated on the public in order to sustain a ruling class: “Religion is regarded by the common people as true, by the wise as false, and by rulers as useful.”

It is interesting that even though the climate for freethinking was better in the time of the early Greeks, Cicero’s quote illustrates that people who were skeptical about the existence of god still had to be discreet for fear of repercussions. That caution has continued to this day and explains why so many atheists are still fearful about proclaiming their disbelief publicly.

The conversion to Christianity of the Roman Emperor Constantine (280-337 CE) led to Christianity becoming the favored religion of the Roman Empire and the beneficiary of state patronage. It also forced freethinkers to lie low in society, and led to the suppression of those early Greek writings that supported atheism. Heretics were persecuted, and this practice became institutionalized with the various forms of the Inquisition by the church beginning around the 12th century. Recall that most ‘heretics’ were not atheists but religious people whose views differed from Catholic orthodoxy. This effectively forced specific religious beliefs on people, requiring public affirmations of religious orthodoxy, a practice that has remained in force to this day, as we see with politicians routinely spouting pieties.

The arrival of the Renaissance around 1500 CE signaled a new time. The birth of the new sciences with Copernicus, Galileo, and Newton was coupled with the recovery of those early Greek skeptical writings, which Arab scholars had preserved. All this led to a flowering of new kinds of thinking. But those early days of modern science did not by themselves lead to a rise of disbelief or atheism. After all, those well-known scientists were all pious people, not skeptics. They simply felt it was inconceivable that science would reveal anything incompatible with god’s work in the world, so they did not seem to suffer any personal anxieties about where their research would lead. They felt that any seeming contradiction between scientific knowledge and the Bible had to be due to a misinterpretation of the Bible. In that sense they were far more sophisticated than present-day Biblical literalists, who lay the blame for the same conflicts at the feet of faulty science rather than religious texts.

When Galileo was asked by the church to explain the conflict between his views and the Bible, he said, quite reasonably, that the church had no choice but to agree with whatever knowledge science was producing. He said it would be “a terrible detriment for the souls if people found themselves convinced by proof of something that it was made a sin to believe.” (Almost Like a Whale, Steve Jones, 1999, p. 26) Of course, the Catholic Church did not heed his views, putting him under house arrest, and it is amazing that it was only as late as 1984 that they officially apologized for their treatment of him.

So even during the period called the ‘enlightenment’ (roughly 1500-1800 CE), there continued to be a climate in which freethinking was discouraged, with severe penalties for blasphemy. The Inquisition was also gaining strength around this time, forcing freethinkers to refrain from public disavowals of god or even of Christian orthodoxy. In this climate, the re-emergence of skeptical beliefs necessarily had to be very cautious and incremental.

Next in this series: The beginnings of modern atheism.

POST SCRIPT: Question: What is a non sequitur?

Miss Teen USA 2007 finalist provides an illustration.

The journey to atheism

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some early ones. Today’s one is from August 8, 2005, edited and updated.)

In a comment to a previous post, Jim Eastman said something that struck me as very profound. He said:

It’s also interesting to note that most theists are also in the game of declaring nonexistence of deities, just not their own. This quote has been sitting in my quote file for some time, and it seems appropriate to unearth it.

“I contend we are both atheists – I just believe in one fewer god than you do. When you understand why you reject all other gods, you will understand why I reject yours as well.” – Stephen F. Roberts

The Roberts quote captures accurately an important stage in my own transition from belief to atheism. Since I grew up as a Christian in a multi-religious society and had Hindu, Muslim, and Buddhist friends, I had to confront the question of how to deal with other religions. My answer at that time was simple – Christianity was right and the others were wrong. Of course, since the Methodist Church I belonged to had an inclusive, open, and liberal theological outlook, I did not equate this distinction with good or evil or even heaven and hell. I felt that as long as people were good and decent, they were somehow all saved, irrespective of what they believed. But there was no question in my mind that Christians had the inside track on salvation and that others were at best slightly misguided.

But as I got older and reached middle age, I found the question posed by Roberts increasingly hard to answer. It became clear to me that when I said I was a Christian, this was not merely a statement of what I believed. Implicitly I was also saying, in effect if not in words, that I was not a Hindu, Muslim, Jew, Buddhist, etc. As in the quote above, I could not satisfactorily explain to myself the basis on which I was rejecting those religions. After all, like most people, I believed in my own religion simply because I had grown up in that tradition. I had little or no knowledge of other religions and hence had no real grounds for rejecting them. In the absence of a convincing reason for rejection, I decided to just remove myself from any affiliation whatsoever, and started to consider myself a believer in a god that was not bound by any specific religious tradition.

But when one is just a free-floating believer in god, without any connection to organized religion and the comforting reinforcement that comes with regular worship with others, one starts asking difficult questions about the nature of god and the relationship of god to humans, questions that the answers provided by organized religious dogma simply do not satisfy. When one is part of a church or other religious structure, one struggles with difficult questions (suffering, the virgin birth, the nature of the Trinity, original sin, the basis for salvation, etc.), but those difficulties are addressed within a paradigm that assumes the existence of god, and thus always provides, as a last resort, the response that the ways of god are enigmatic and beyond the comprehension of mere mortals. People can be urged to accept things on faith as if it were virtuous to do so.

But when I left the church, I started struggling with different questions such as why I believed that god existed at all. And if she/he/it did exist, how and where and in what form did that existence take, and what precisely was the nature of the interaction with humans?

I found it increasingly hard to come up with satisfactory answers to these questions and I remember the day when I decided that I would simply jettison the belief in god altogether. Suddenly everything seemed simple and clear. It is very likely that I had arrived at this conclusion even earlier but that my conscious mind was rejecting it until I was ready to acknowledge it. It is hard, after all, to give up a belief that has been the underpinning of one’s personal philosophy since childhood. But the feeling of relief that accompanied my acceptance of non-belief was almost palpable and unmistakable, making me realize that my beliefs had probably been of a pro forma sort for some time.

Especially liberating to me was the realization that I did not have to examine all new discoveries of science to see if they were compatible with my religious beliefs. I could now go freely wherever new knowledge led me without wondering if it was counter to some religious doctrine.

Another benefit of not believing is that one could be more consistent in how one interpreted events. For example, religious survivors of some calamity are often quick to claim that god must have saved them from harm while refusing to acknowledge that, by that logic, god must have wanted all the others to perish. The media reinforces this kind of silly thinking. Jon Stewart on his Daily Show skewered how the media quickly jumped on the “It’s a miracle!” bandwagon to “explain” the lack of any fatalities when an Air France plane crashed in Toronto in 2005. There was a perfectly natural and even admirable alternative explanation for this, which was the calmness and competence of the crew that managed to get everyone off the plane less than two minutes after the crash. And yet the media, rather than giving credit to all the emergency personnel involved, quickly started playing the “miracle” theme.

As Stewart said: “The only thing that was a miracle in that situation was the lightning that hit the plane, that was the act of God. If anything, God was trying to kill these people. His plan was foiled by the crew’s satanic competence.”

There was a time when I too would have credited god for saving the people in the plane crash while not laying the blame on him for people who died in other plane crashes. Now those kinds of contradictions are glaringly obvious.

A childhood friend of mine who knew me during my church-religious phase was surprised by my change and reminded me of two mutual friends who, again in middle age, had made the transition in the opposite direction, from atheism to belief. He asked me if it was possible that I might switch again.

It is an interesting question to which I, of course, cannot know the answer. My personal philosophy satisfies me now but who can predict the future? What seems clear to me is that the standard answers provided by religion that satisfied me once will not satisfy me anymore. I have a much higher standard of evidence. But while conversions from atheism to belief and vice versa are not uncommon, I am not sure how common it is for a single person to make two such U-turns and end up close to where they started. It seems like it would be a very unlikely occurrence.

Waiting for the Rapture

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some early ones. Today’s one is from May 9, 2005, edited and updated.)

I am a huge fan of the English comic writer P. G. Wodehouse, especially of his Jeeves and Wooster books. The plots are pretty much the same in all the Jeeves stories but the smoothness of Wodehouse’s writing, his superb comic touch, and his brilliant choice of words make him a joy to read. Even though I have read all of the Jeeves books many times and know all the plots, they still have the ability to make me laugh out loud.

In a typical Jeeves story, the hapless Bertie Wooster is invariably at some point trapped in a fast moving series of events that swirl around him and are beyond his control, pulling him in all directions, none of them promising good outcomes for him, before Jeeves ingeniously rescues him and provides happy endings all around. But often, when the chaos is at its height and Bertie feels completely overwhelmed, he would say that he “felt like he was living in the Book of Revelations.”

If you read the Book of Revelations (the last book of the Biblical New Testament, also called “The Revelation of John”) you will see what Bertie means. It is for the most part a bizarre series of visions involving strange animals, angels, stars crashing into the ground, the sun getting eaten up, fires, plagues, and mass killings that would delight special effects creators, if it were ever to be made into a film.

When I was studying to become a lay preacher in the Methodist church, we pretty much gave this weird book a miss, treating it as one might a dotty uncle who has to be invited to every family function, but who you hope will not make a scene, and who you wish no one would notice and ask about. We studied mainly the Gospels that focused on the life and teaching of Jesus, the Acts of the Apostles, some of the letters by Paul, some of the Old Testament prophets, church and biblical history, and theology. The Book of Revelations was just too far out there.

So it is somewhat amazing to me that it is this book that is driving much of the new militant Christianity, while the Gospels and the actual teachings of Jesus have faded into the background. And the idea that seems to have gripped the imagination of many such Christians in the US is that of the Rapture, associated with the second coming of Jesus which signals the end of the world.

Many of the basic beliefs about the coming of the Rapture come from the letters written by Paul to various communities, but the full apocalyptic vision of the Rapture is found in Revelations. This book is the source of much cryptic language and symbolism that enables people to look for clues as to when the Rapture will occur, what the signs of its imminence are, and how to identify the good and the bad people. Like the writings of Nostradamus, the “predictions” are vague enough to allow for endless speculations and to “explain” anything. It also has enough numbers to keep numerologists busy for millennia trying to interpret their meanings. The numbers six, seven, and twelve seem to have special significance.

(Incidentally, there is a huge internet industry dealing with the Rapture and speculations about it are rampant. One such set of speculations deals with the identity of the “Antichrist” (who seizes power for a short time after the Rapture before being vanquished), and nominees for that post include Prince Charles and Bill Clinton. See also the Rapture Index which calculates (along the lines of the Dow Jones Index) a number to give a measure of how close we are to the Rapture. Currently the number stands at 161. This is below the 2001 peak of 182 but any number above 145 falls into the highest category, labeled as “fasten your seat belts,” meaning that the signs are favorable for the Rapture happening any time.)

As far as I can tell, popular belief about the Rapture (as opposed to serious theology about it) is that it is associated with the second coming of Jesus and marks the moment when true believers in Christ (both dead and living), will be taken up to heaven to join him. It will be a sudden event, occurring without warning. People who are saved (and whose names have been “recorded” from the beginning of time) will be taken up instantaneously and disappear, leaving just their clothes behind. So if you happen to be with a group of people and several of them suddenly vanish from your sight, leaving their clothes and shoes in a little pile on the ground, that means the Rapture has occurred and that you, personally, have not made the cut.

Up to this point, since I have a live-and-let-live philosophy, the Rapture sounds fine. If true believers are taken away to lead blissful lives somewhere other than the Earth, leaving the rest of us behind, I have no problem with that. I wish them all happiness in their eternal life as the rest of us somehow muddle through on this Earth without them. Actually, life on Earth might be a whole lot better without all these Rapturites around waiting impatiently for the end times. Clearly there will be some temporary disruptions in life as new people will have to be found to do the jobs that those Raptured away used to do, but these do not seem to be insurmountable problems since some estimates put the number of people who will be Raptured as low as 144,000 (another number that appears in Revelations).

But that is not apparently how it works. Those left behind are not also left alone, unfortunately. We are not to be kept busy merely distributing all the clothes left behind to various Goodwill stores. Instead we are to be victims of a massive and gruesome slaughter, with huge rivers of blood flowing everywhere, before everything comes to an end. The book of Revelations speaks of the flowing blood rising to the height of a horse’s bridle for a radius of 200 miles. (Since I enjoy mathematical estimation problems, I briefly toyed with the idea of estimating how many corpses it would take to create this much blood, but simply could not muster the enthusiasm for this straightforward but macabre task. But it would make for a nifty little homework problem in those religious schools that teach about the Rapture seriously.)
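Since the passage above frames this as a straightforward estimation problem, here is a minimal back-of-the-envelope sketch of it in Python. The bridle height (about 1.5 m) and the blood volume per person (about 5 liters) are my own assumed figures, not from the text, and the flooded region is taken as a disk of radius 200 miles as described above.

```python
import math

# All figures below are assumptions for a rough Fermi estimate,
# not claims from Revelations or from this post.
MILES_TO_M = 1609.34
radius_m = 200 * MILES_TO_M      # radius of the flooded region
depth_m = 1.5                    # approximate height of a horse's bridle (assumed)
blood_per_person_L = 5.0         # typical adult blood volume in liters (assumed)

# Volume of a disk of blood: pi * r^2 * depth, converted to liters
volume_m3 = math.pi * radius_m**2 * depth_m
volume_L = volume_m3 * 1000      # 1 m^3 = 1000 L

people = volume_L / blood_per_person_L
print(f"Roughly {people:.1e} people")   # on the order of 10^14
```

For comparison, commonly cited demographic estimates put the total number of humans who have ever lived at around 10^11, so the required body count would exceed that by roughly a thousandfold, which gives some sense of the scale of the imagery.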

It is hard to estimate how many people take this idea of the Rapture seriously but given the numbers claimed by the Dominionist movement (around 30 million) it could be quite large. The twelve sequential novels of the Left Behind series by Tim LaHaye and Jerry B. Jenkins (which weave a fictional tale around the Rapture) claim a combined readership of 42 million. Of course, many in that number will be repeat buyers of the series and not all may be believers in the underlying message, but the numbers are still impressive. (Note that LaHaye is a co-founder with the late Jerry Falwell of the Moral Majority and works at Falwell’s Liberty University in Virginia.)

I haven’t actually read the Left Behind books myself or seen the film based on them (with all the books that I would really like to read, I just can’t see myself reading a million words of Rapture-based fiction), but Gene Lyons has a highly entertaining review of all the books and their message in the November 2004 issue of Harper’s Magazine. He says that the “books portray Midwestern suburbanites and born-again Israeli converts as Warrior Jesus’ allies in an apocalyptic struggle against a U.N.-anointed “World Potentate,” who looks “not unlike a younger Robert Redford” and speaks the language of science and liberal internationalism.”

The sins for which people are fingered to be slaughtered at the end of the world are sexual sins (fornication, homosexuality) or those of apostasy and blasphemy. Once again, it seems as if the only sins that these kinds of Christians care about are those involving sex and violations of religious orthodoxy. Swindling retirees out of their life savings, depriving people of health care, making people work in sweatshops, stealing from old and poor people whatever they have, cheating on one’s taxes, beating one’s spouse and children, and being abusive to one’s employees are seemingly not things that automatically disqualify you from being taken up at the Rapture, but take one wrong step on sexual and doctrinal issues and you are toast (literally).

Interestingly though, Barbara R. Rossing in her book The Rapture Exposed says that the particular form of the apocalyptic vision that seems so appealing to many American Christians these days originated quite recently, with a nineteenth-century Anglo-Irish evangelist named John Darby, and owes much of its popularity to turmoil over Darwinism. Lyons says that “Rossing argues persuasively that certain people are attracted to Darby’s dispensationalist system with its Rapture theology because it is so comprehensive and rational – almost science-like – a feature that made it especially appealing during battles over evolution during the 1920s and 1930s.”

So we return once again to Darwin and evolution in the crosshairs of the evangelical movement. It is interesting to me how these two strands of human thought (science and religion) keep butting up against each other. Rossing’s thesis sheds some more light on why evolutionary theory seems to be such a burr under the saddle for evangelical Christians, driving them to furious opposition in ways that other scientific beliefs do not.

What makes us change our minds?

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some early ones. Today’s one is from March 28, 2005, edited and updated.)

In an earlier post, I described the three kinds of challenges teachers face. Today I want to discuss how teachers might deal with each case.

On the surface, it might seem that the first kind of challenge (where students do not have much prior experience (either explicitly or implicitly) with the material being taught and don’t have strong feelings about it either way) is the easiest one. After all, if students have no strong beliefs or prior knowledge about what is being taught, then they should be able to accept the new knowledge more easily.

That is true, but the ease of acceptance also has its downside. The very act of not caring means that the new knowledge goes in easily but is also liable to be forgotten easily once the course is over. In other words, it might have little lasting impact. Since the student has little prior knowledge in that area, there is little in the brain to anchor the new knowledge to. And if the student does not care about it one way or the other, then no effort will be made by the student to really connect to the material. So the student might learn this material by mostly memorizing it, reproduce it on the exams, and forget it a few weeks later.

The research on the brain indicates that lasting learning occurs when students tie new knowledge to things they already know, when they integrate it with existing material. So teachers of even highly technical topics need to find ways to connect it with students’ prior knowledge. They have to know their students, what interests them, what concerns them, what they care about. This is why good teachers tie their material in some way to stories or topics that students know and care about, that may be in the news, or that are controversial. Such strategies tap into the existing knowledge structures in the brain (the neural networks) and connect the new material to them, so that it is more likely to ‘stick.’

The second kind of challenge is where students’ life experiences have resulted in strongly held beliefs about a particular knowledge structure, even though the student may not always be consciously aware of having such beliefs. A teacher who does not take these existing beliefs into account when designing teaching strategies is likely to be wasting her time. Because these beliefs are so strongly, but unconsciously held, they are not easily dislodged or modified.

The task for the teacher in this case is to make students aware of their existing knowledge structures and the implications of them for understanding situations. A teacher needs to create situations (say experiments or cases) and encourage students to explore the consequences of their prior beliefs and see what happens when they are confronted by these new experiences. This has to be done repeatedly in newer and more enriched contexts so that students realize for themselves the existence and inadequacy of their prior knowledge structures and become more accepting of the new knowledge structures and theories.

In the third case, students are consciously rejecting the new ideas because they are aware that those ideas conflict with views they value more (for whatever reason). This is the situation with those religious people who reject evolutionary ideas because they conflict with their religious beliefs. In such cases, there is no point trying to force or browbeat them into accepting the new ideas.

Does this mean that such people’s ideas never change? Obviously not. People do change their views on matters that they may have once thought were rock-solid. In my own case, I know that I now believe things that are diametrically opposed to things that I once thought were true, and I am sure that my experience is very common.

But the interesting thing is that although I know that my views have changed, I cannot tell you when they changed or why they changed. It is not as if there was an epiphany where you slap your forehead and exclaim “How could I have been so stupid? Of course I was wrong and the new view is right!” Rather, the process seems more like being on an ocean liner that is turning around. The process is so gentle that you are not aware that it is even happening, but at some point you realize that you are facing in a different direction. There may be a moment of realization that you now believe something that you did not before, but that moment is just an explicit acknowledgment of something that you had already tacitly accepted.

What started the process of change could be one of many factors – something you read, a news item, a discussion with a friend, some major public event – whose implications you may not be immediately aware of. But over time these little things lodge in your mind, and as your mind tries to integrate them into a coherent framework, your views start to shift. For me personally, I enjoy discussions of deep ideas with people I like and respect. Even if they do not have any expertise in this area, discussions with such people tend to clarify one’s ideas.

I can see that process happening to me right now with the ideas about the brain. I used to think that the brain was quite plastic, that any of us could be anything given the right environment. I am not so sure now. The work of Chomsky on linguistics, the research on how people learn, and other bits and pieces of knowledge I have read have persuaded me that it is not at all clear that the perfectly-plastic-brain idea can be sustained. It seems reasonable that some structures of the brain, especially the basic ones that enable it to interpret the input from the five senses, and perhaps even learn language, must be pre-existing.

But I am not completely convinced by the sociobiological views of people like E. O. Wilson and Steven Pinker, who seem to argue that much of our brains, attitudes, and values are biologically determined by evolutionary adaptation. I am also not convinced by much of the popular writing on gender-related differences, such as the claims that men are better than women at math or that women are more nurturing than men. That seems to me to be a little too pat. I am always a little skeptical of attempts to show that the status quo is ‘natural’ since that has historically been used to justify inequality and oppression.

But the works of cognitive scientists are interesting and I can see my views on how the brain works changing slowly. One sign of this is my desire to read widely on the subject.

So I am currently in limbo as regards the nature of the brain, mulling things over. At some point I might arrive at some kind of unified and coherent belief structure. And after I do so, I may well wonder if I ever believed anything else. Such are the tricks the brain can play on you, to make you think that what you currently believe is what is correct and what you always believed.

POST SCRIPT: The Church of the Wholly Undecided

Les Barker has a funny poem about agnosticism.

The purpose of teaching

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some early ones. Today’s one is from March 24, 2005, edited and updated.)

I have been teaching for many years and encountered many wonderful students. I remember in particular two students who were in my modern physics courses that dealt with quantum mechanics, relativity, and cosmology.

Doug was an excellent student, demonstrating a wonderful understanding of all the topics we discussed in class. But across the top of his almost perfect final examination paper, I was amused to see that he had written, “I still don’t believe in relativity!”

The other student was Jamal, and he was not as direct as Doug. He came into my office a few years after the course was over (and just before he was about to graduate) to say goodbye. We chatted awhile, I wished him well, and then as he was about to leave he turned to me and said hesitantly in his characteristically shy way: “Do you remember that stuff you taught us about how the universe originated in the Big Bang about 15 billion years ago? Well, I don’t really believe all that.” After a pause he went on, “It kind of conflicts with my religious beliefs.” He looked apprehensively at me, perhaps to see if I might be offended or angry or think less of him. But I simply smiled and let it pass. It did not bother me at all.

Why was I not upset that these two students had, after having two semester-long courses with me, still not accepted the fundamental ideas that I had been teaching? The answer is simple. The goal of my teaching is not to change what my students believe. It is to have them understand what practitioners in the field believe. And those are two very different teaching goals.

As I said, I have taught for many years. And it seems to me that teachers encounter three kinds of situations with students.

One is where students do not have much prior experience (either explicitly or implicitly) with the material being taught and don’t have strong feelings about it either way. This is usually the case with technical or highly specialized areas (such as learning the symptoms of some rare disease or applying the laws of quantum mechanics to the hydrogen atom). In such cases, students have little trouble accepting what is taught.

The second type of situation is where students’ life experiences have resulted in strongly held beliefs about a particular knowledge structure, even though the student may not always be consciously aware of having such beliefs. The physics education literature is full of examples of how our life experiences conspire to create in people an Aristotelian understanding of mechanics. This makes it hard for them to accept Newtonian mechanics. Note that this difficulty exists even though the students have no particular attachment to Aristotle’s views on mechanics and may not have the faintest idea what they are. Overcoming this kind of implicit belief structure is not easy. Doug was an example of someone who had got over the first hurdle from Aristotelian to Newtonian mechanics, but was finding the next transition to Einsteinian relativistic ideas much harder to swallow.

The third kind of situation is where the student has strong and explicit beliefs about something. These kinds of beliefs, as in the case of Jamal, come from religion or politics or parents or other major influences in their lives. You cannot force such students to change their views and any instructor who tries to do so is foolish. If students think that you are trying to force them to a particular point of view, they are very good at telling you what they think you want to hear, while retaining their beliefs. In fact, trying to force or bully students to accept your point of view, apart from being highly unethical teaching practice, is a sure way of reinforcing the strength of their original views.

So Doug’s and Jamal’s rejection of my ideas did not bother me and I was actually pleased that they felt comfortable telling me so. They had every right to believe whatever they wanted to believe. But what I had a right to expect was that they had understood what I was trying to teach and could use those ideas to make arguments within those frameworks.

For example, if I had given an exam problem that required that the student demonstrate his understanding of relativistic physics to solve, and Doug had refused to answer the question because he did not believe in relativity or had answered it using his own private theories of physics, I would have had to mark him down.

Similarly, if I had asked Jamal to calculate the age of the universe using the cosmological theories we had discussed in class, and he had instead said that the universe was 6,000 years old because that is what the Bible said, then I would have to mark him down too. He is free to believe what he wants, but the point of the course is to learn how the physics community interprets the world, and be able to use that information.

Understanding this distinction is important because of the periodic appearance of demagogues who try to frighten people by asserting that colleges are indoctrinating students to think in a particular way. Such people seem to assume that students are like sheep who can be induced to believe almost anything the instructor wants them to and thus require legal protection. Anyone who has taught for any length of time and has listened closely to students will know that this is ridiculous. It is not that students are not influenced by teaching and do not change their minds but that the process is far more complex and subtle than it is usually portrayed, as I will discuss in the next posting.

My own advice to students is to listen carefully and courteously to what knowledgeable people have to say, learn what the community of scholars thinks about an issue, and be able to understand and use that information when necessary. Weigh the arguments for and against any issue but ultimately stand up for what you believe and, even more importantly, know why you believe it. Don’t ever feel forced to accept something just because some ‘expert’ or other authority figure (teacher, preacher, parent, political leader, pundit, or media talking head) tells you it is true. Believe things only when they make sense to you and you are good and ready for them.

“I know this is politically incorrect but . . .”

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some early ones. Today’s one is from August 14, 2006, edited and updated.)

One of the advantages of being older is that sometimes you can personally witness how language evolves and changes, and how words and phrases undergo changes and sometimes outright reversals of meaning.

One of the interesting evolutions is that of the phrase “politically correct.” It was originally used as a kind of scornful in-joke within Marxist political groups to sneer at those members who seemed to have an excessive concern with political orthodoxy and who seemed to be more preoccupied with vocabulary than with the substance of arguments and actions.

Later it became used as a weapon against those who were trying to make language more nuanced and inclusive and less hurtful, judgmental, or discriminatory. Such people advocated using “disabled” instead of “crippled” or “mentally ill” instead of “crazy,” or “hearing impaired” instead of “deaf” and so on in an effort to remove the stigma under which those groups had traditionally suffered. Those who felt such efforts had been carried to an extreme, or just wanted to use words the way they always had, disparaged those efforts as trying to be “politically correct.”

The most recent development has been to shift the emphasis from sneering at the careful choosing of words to sneering at the ideas and sentiments behind those words. The phrase has started being used pre-emptively, to shield people from the negative repercussions of stating views that otherwise may be offensive or antiquated. This usage usually begins by saying “I know this is politically incorrect but . . .” and then finishes up by making a statement that would normally provoke quick opposition.

So you can now find people saying “I know this is politically incorrect but perhaps women are inferior to men at mathematics and science” or “I know this is politically incorrect but perhaps poor people are poor because they are stupid” or “I know this is politically incorrect but perhaps blacks are less capable than whites at academics.” The opening preamble is not only designed to make such statements acceptable, the speaker can even claim the mantle of being daring and brave, an outspoken and even heroic bearer of unpopular or unpalatable truths.

Take, for example, a blurb by intelligent design creationist Jonathan Wells for his own book The Politically Incorrect Guide to Darwinism and Intelligent Design. The cover of the book says: “Darwin is an emperor who has no clothes— but it takes a brave man to say so. Jonathan Wells, a microbiologist with two Ph.D.s (from Berkeley and Yale), is that brave man.” There have been similar books that try this same linguistic maneuver, such as The Politically Incorrect Guide to Science and The Politically Incorrect Guide to Global Warming (and Environmentalism).

Brandishing the label of being ‘politically incorrect’ as a form of argument is silly, as is invoking the fact that one has a doctorate. It is actually a sign of weakness, indicating that one’s arguments cannot stand on their own. For example, physicists assume that all electrons are identical. We don’t really know this for a fact, since it is impossible to compare all electrons. The statement “all electrons are identical” is a kind of default position and, in the absence of evidence to the contrary, does not need to be supported by positive evidence. The assertion that “some electrons are heavier than others” is going to be dismissed in the absence of supporting evidence. Simply saying “I know this is not politically correct but I believe some electrons are heavier than others and I have a PhD” does not make it any more credible. It merely makes you look pompous and self-aggrandizing.

Sentiments that would normally be considered discriminatory, biased, and outright offensive if uttered without any supporting evidence are now protected from criticism by this preamble. It is then the person who challenges this view who is put on the defensive, as if he or she were some prig who unthinkingly spouts an orthodox view.

Fintan O’Toole of The Irish Times (May 5, 1994) noted this trend early and pithily said:

We have now reached the point where every goon with a grievance, every bitter bigot, merely has to place the prefix, “I know this is not politically correct but . . .” in front of the usual string of insults in order to be not just safe from criticism but actually a card, a lad, even a hero. Conversely, to talk about poverty and inequality, to draw attention to the reality that discrimination and injustice are still facts of life, is to commit the new sin of political correctness. . . . Anti-PC has become the latest cover for creeps. It is a godsend for every sort of curmudgeon or crank, from the fascistic to the merely smug.

Hate blacks? Attack positive discrimination – everyone will know the codes. Want to keep Europe white? Attack multiculturalism. Fed up with the girlies making noise? Tired of listening to whining about unemployment when your personal economy is booming? Haul out political correctness and you don’t even have to say what’s on your mind.

Even marketers are cashing in on this anti-PC fad, as illustrated by this cartoon.

Here’s a tip. Anyone who feels the need to invoke the “politically incorrect” trope as an indicator of his or her valor is probably trying to hide the weaknesses in their argument.

POST SCRIPT: Comparing the candidates

How do the presidential candidates compare when it comes to where they stand on the left-right and authoritarian-libertarian continua?

You can see for yourself, based on their positions on a range of issues.

I found it interesting (but not surprising) that every candidate of both parties (except for Democrats Dennis Kucinich and Mike Gravel) ended up in the right-wing/authoritarian quadrant.

You can also answer the questions yourself and compare yourself to them. My scores put me deep in the southwest part of the left-libertarian quadrant, more so than Kucinich and Gravel.

These kinds of things are fun but should not be considered a serious analysis of political philosophies.