Soaps and Soap

For a very brief time in my life, about one week actually, I got hooked on daytime TV soap operas.

It happened in December of 1978. I had received a phone call that my father had died suddenly of a heart attack back in Sri Lanka. I was in graduate school in the US, far away from my family, and thus away from the kinds of support networks and rituals that help one get through such times of grief. I could not concentrate on my studies or reading or other things to distract my mind, so I turned for solace to watching TV all day, as so many do in such situations when seeking escapism through mindless activity.

In those pre-internet and early cable days, your TV choices were largely limited to just the three networks CBS, NBC, and ABC, and during the day all three served up a diet of talk shows, game shows, and soap operas. Although I wasn’t at all interested at first, quite soon I was absorbed in the various stories that made up the soaps. For those not familiar with the genre, these daytime soap operas weave multiple intersecting story lines around quite a large cast of characters, usually middle class or rich people, with a few low-lifes thrown in to spice things up. The tales involve love, jealousy, intrigue, adultery, murder, larceny, backstabbing, lying, cheating, and other strong human characteristics.

These programs can be quite addictive and develop faithful followings as can be seen from the longevity of soaps like Days of Our Lives, All My Children, The Young and the Restless and As the World Turns, all of which have lasted over three decades.

Although I stopped watching after a week, these shows gave me a greater appreciation for the riotously funny weekly prime-time sitcom Soap, which was a parody of the daytime soaps, and ran for four seasons during the years 1977-1981.

The basic story of Soap was that of the intersecting lives of two families, the Tates and the Campbells, where the two mothers Jessica Tate and Mary Campbell were sisters. The best way to describe Soap is as daytime soap opera on steroids. Where the daytime soaps’ stories proceeded excruciatingly slowly, with long pregnant pauses in the dialogue, lengthy meaningful looks, and dragged-out plot developments, Soap went at breakneck speed with plot twists occurring in rapid-fire succession. All the standard complex plotlines of the daytime soaps were present and then made even more extreme in Soap by adding outlandish things like UFOs, alien abductions, demon-possessions, guerrillas, gangsters, blackmail, kidnappings, exorcisms, brainwashing by a religious cult (led by the Reverend Sun whose followers were called “the Sunnies”!) and so on. Storylines that would be sufficient for a full season on the regular soaps were crammed into just a few episodes of Soap. This breathless pace was compressed into weekly half-hour programs, each episode beginning in classic soap style with a voice-over announcer recapping what had happened in previous episodes, and ending with a dramatic cliff-hanger, followed by the announcer hyping up the suspense for the episodes to come.

What really made Soap one of the funniest TV programs was clever writing coupled with one of the best ensemble casts ever put together, easily triumphing over those of the more-heralded Seinfeld or Friends casts. Katherine Helmond as the ditzy Jessica (whom men found irresistible) and Cathryn Damon as Mary were the anchors that held the two families (and the show) together as increasingly bizarre things happened all around them. Some of the funniest scenes were when the two were sitting around a kitchen table, each trying to bring the other up-to-date on the latest bizarre happenings in their families and, in a perverse way, competing to top each other’s stories.

Richard Mulligan as Bert Campbell (Mary’s working class husband) was superb in his physical comedy, his body and face seemingly made of rubber, responding spasmodically to his nervous energy. Billy Crystal (a newcomer then) appeared as Jodie (Mary’s son) in what may be the first portrayal of a gay person on TV that got laughs out of being gay while remaining a sympathetic character and avoiding becoming a caricature. Robert Guillaume as Benson, the sardonic back-talking butler for the rich Tates, was another actor who managed to take what might have become a stereotypical role (black servant of a rich white family) and infuse it with dignity and humor. In fact Crystal and Guillaume were perhaps the most sensible (or at least the least eccentric) of the entire Tate-Campbell menagerie.

Perhaps the most eccentric character was Bert’s son Chuck, who always went around with his ventriloquist dummy Bob. Chuck acted like Bob was a real person and would hold conversations with him while Bob would insult everyone and leer at women. The humor arose because other members of the family also sometimes ended up treating Bob as a real person, speaking and arguing and getting angry with him, while not holding Chuck responsible for Bob’s words. (It is interesting to speculate as to what you would do if someone you knew acted like Chuck did. In order to spare his feelings, wouldn’t you also treat his dummy like a real person, even if you felt ridiculous doing so?) In one such scene, Chuck plans to go out on a date leaving Bob behind, but Bob harangues him until Chuck agrees to take him along. When they both finally leave, Mary, who has been watching all this with Bert, asks him whether they shouldn’t get professional help for Chuck, to which Bert replies, “Chuck doesn’t need professional help, he should just learn to discipline Bob more.”

The reason for these fond reminiscences is that I just discovered that these old programs are now available on DVD and I have been watching them again. There is always a danger in taking these kinds of trips down nostalgia lane because one’s memories of old books, films and TV programs often make them seem better than they actually were. I was a little fearful that Soap would disappoint, but it passed the test handily. It is still laugh-out-loud funny.

The added bonus to watching on DVD is the absence of commercials. I also noticed how the opening and closing credits were more leisurely than they are now, allowing one to actually read the names of the actors and crew without distracting sidebar promos for other shows. The running time of each half-hour episode then was also 24 minutes and 30 seconds. I suspect that nowadays this has been reduced to allow for more commercial breaks.

There were other good TV comedies at that time, like M*A*S*H and Newhart, but I would not seek out DVDs of them the way I did with Soap.

Soap was a comedy classic and if you get the chance you should see it. And make sure you watch it in sequence.

POST SCRIPT: Class politics

Here’s another provocative clip from the 1998 film Bulworth (strong language advisory).

Language and Evolution

I have always been fascinated by language. This is somewhat ironic since I have a really hard time learning a new language and almost did not make it into college in Sri Lanka because of extreme difficulty in passing the 10th-grade language requirement in my own mother tongue of Tamil! (How that happened is a long and not very interesting story.)

But language fascinates me. How words are used, their origins, how sentences are structured, are all things that I enjoy thinking and reading about. I like playing with words, and enjoy puns, cryptic crosswords, and other forms of wordplay.

All this background is to explain why I recommend an excellent book The Power of Babel by John McWhorter, who used to be a professor of linguistics at the University of California, Berkeley but is now a fellow at the Manhattan Institute. In the book he discusses the complexity of language and points out that the evolution of language is very similar to that of biological life. He suggests that there was originally just one spoken, very primitive, language and as the people who spoke it fanned out across the globe, the various languages evolved as separated communities formed. And in the process the languages became more complex and sophisticated, and evolved intricate features in their vocabulary and grammar that now seem to have little functional purpose, in a manner very analogous to biological systems.

The precise origin of spoken language is hard to pin down. McWhorter argues that it probably arose with the evolution of the ability to form complex sounds, roughly synchronous with the arrival of Homo sapiens about 150,000 years ago. Others have suggested a more recent date for the origins of language, about 12,000-15,000 years ago, but pinning this date down precisely is next to impossible given that spoken language leaves no traces. What we do know is that written language began about 5,000 years ago.

McWhorter points out that purely spoken languages evolve and change very rapidly, resulting in an extremely rapid proliferation of languages, leaving us with the 6,000 or so that we have now. It was the origin of writing, and more importantly mass printing, that slowed down the evolution of language, since the fixed words on paper acted as a brake on further changes.

He also makes an important point that the distinction between standard and dialect forms of languages has no hierarchical value and is also a post-printing phenomenon. In other words, when we hear people (say) in rural Appalachia or in the poorer sectors of inner cities speak in an English that is different from that spoken by middle class, college-educated people, it is not the case that they are speaking a debased form of ‘correct’ or ‘standard’ English. He argues that dialects are all there is or ever was, because language was always mainly a local phenomenon. There are no good or bad dialects, there are just dialects.

We can, if we wish, bundle together a set of dialects that share a lot in common and call it a language (like English or French or Swahili) but no single strand in the bundle can justifiably lay any intrinsic claim to be the standard. What we identify as standard language arose due to factors such as politics and power. Standard English now is that dialect which was spoken in the politically influential areas near London. Since that area was then the hub of printing and copying, that version of language appeared in the written form more often than other forms and somewhere in the 1400s became seen as the standard. The same thing happened with standard French, which happened to be the dialect spoken in the Paris areas.

McWhorter points out that, like biological organisms, languages can and do go extinct in that people stop speaking them and they disappear or, in some cases like Latin, survive only in fossilized form. In fact, most of the world’s languages that ever existed have already gone extinct, as is the case with biological species. He says that rapid globalization is making many languages disappear even more rapidly because as people become bi-lingual or multi-lingual, and as a few languages emerge as the preferred languages of commerce, there is less chance of children learning the less-privileged language as their native tongue. This loss in the transmission of language to children as their primary language is the first stage leading to eventual extinction. He points out that currently 96 percent of the world’s population speaks at least one of just twenty languages, in addition to their indigenous language. These languages are Chinese, English, Spanish, Hindi, Arabic, Bengali, Russian, Portuguese, Japanese, German, French, Punjabi, Javanese, Bihari, Italian, Korean, Telugu, Tamil, Marathi, and Vietnamese, and thus these are the languages most likely to survive extinction. It is noteworthy that the population of India is so large and diverse that seven of these languages originated there, and two others (English and Arabic) are also used extensively in that country.

He also points out that languages are never ‘pure’ and that this situation is the norm. Languages cross-fertilize with other languages to form language stews, so that language chauvinists who try to preserve some pure and original form of their language are engaged in a futile task. For example, of all the words in the Oxford English Dictionary, more than 99 percent were originally obtained from other languages. However, the remaining few that originated in Old English, such as and, but, father, love, fight, to, will, should, not, from turn out to be 62 percent of the words that are used most.

McWhorter is a very good writer, able to really bring the subject to life by drawing on everyday matters and popular culture. He has a breezy and humorous style and provides lots of very interesting bits of trivia that, while amusing, are also very instructive of the points he wishes to make. Regarding the ability of language to change and evolve new words, for example, he explains how the word ‘nickname’ came about. It started out as an ‘ekename’ because in Old English the word ‘eke’ meant also, so that an ‘ekename’ meant an ‘also name,’ which makes sense. Over time, though, ‘an ekename’ changed to ‘a nekename’ and eventually to ‘a nickname.’ He gives many interesting examples of this sort.

Those who know more than one language well will likely appreciate his book even more than I did. It is a book that is great fun to read and one I can strongly recommend to anyone who loves words and language.

POST SCRIPT: Whipping up war frenzy

Jon Stewart shows how it is done.

Harry Potter and the supernatural

The release of each new Harry Potter book or film brings out of the woodwork those religious people who are disturbed by the series, decide they need to spam everyone about it, and try to make money from it as well.

I received a spam email following the release of Deathly Hallows that recycled old warnings about the subversive nature of the books and asking me to buy a video to help combat the Potter menace. It said:
[Read more…]

The case against religion

In the previous post, I argued that more education did not necessarily lead to less religion and that if one thought that its net effect on humanity was negative, one needed to campaign against it more actively. But others disagree. Even those who accept that religion has done some truly evil things might argue that the good that it does compensates (at least partially) and merits preserving it. The mere fact that it is false, it might be claimed, should not be sufficient to cause us to undermine it. They could point to children believing in Santa Claus or the Tooth Fairy as examples of false but benign beliefs. What is to be gained by destroying such innocent illusions?

But the reason that beliefs in Santa Claus or the Tooth Fairy are benign is because children are deliberately weaned away from such beliefs before they reach adolescence. If we did not deliberately do so, who knows what might happen? We might end up having wars with armies of the followers of Santa Claus battling with those of the Tooth Fairy. Having adults who are capable of causing great harm believing in magical false things is usually not benign. We are unfortunately all too aware of the truth of Voltaire’s assertion “Those who can make you believe absurdities can make you commit atrocities”.

In a comment to a previous post, Corbin Covault argued that even though religion may be a human construct, it can still serve important functions that merit preserving it. He draws a comparison with government:

One could argue that both [religions and governments] are human social constructions. Both manifest in institutions which may give some great sources of benefit to societies and individuals, but both have also been great sources of destruction and oppression. Would it be fair to say that the argument that the “militant atheist” makes that the world would be better off without _all_ religions analogous to the argument that the anarchist would make that the world would be better off without any governments?

I think that we will all agree that religion can be the source of many good things and of many bad things. We will undoubtedly disagree on whether the net result is positive or negative. But the key question is not how the balance sheet between good and evil comes out for any particular institution, but whether that institution is the only one that provides those benefits, so that we have no choice but to also tolerate the evils that accompany it. With religion, I have argued before that every benefit claimed for it can be provided by other existing sources. If we get rid of religion, while we will lose both the good and the bad, my point was that we can regain every good thing lost using other means and institutions, so in the end we need only lose the bad things caused by religion.

The question then becomes whether we can say the same thing for government. If the answer is yes, then we should undoubtedly get rid of government but currently it seems like the answer is no. We know that there always exists a tension between the existence of governments and individual liberty. But we strike a deal, accepting the restrictions on personal liberty as the price we pay for peace and the benefits of community living. We struggle to define the proper balance between freedom and order. Although we currently seem to need some institutions of government, we are not committed to any one form. We are free to change them if they prove to be evil. We tend to deplore dictatorships and admire democracies and no particular government has any claim to divine sanction. There may come a time when people feel that no government at all is better.

A similar good-evil comparison can be made for science. Science has given us great benefits but has also been responsible for some terrible evils. No scientist can avoid the fact that we are, to some extent, complicit in the many evils done in its name.

My own physics research involved studying what happens when a high-energy photon strikes a nucleus and produces a short-lived sub-nuclear particle called a pi-meson. There is no obvious link between this and any weapons system, but that is deceptive. The whole field of study in which it is embedded, nuclear physics, is an integral part of weapons research. It is not inconceivable that someone else will come along in the future and find that my small and seemingly innocent contribution to the field is important in developing a component of a deadly weapons system. If that happens, I cannot claim complete innocence. Although I may not have intended my work to serve evil ends, the fact that it has the potential to do so is inescapable. No scientist can ever have clean hands.

While it is impossible for scientists to have a perfectly clear conscience, scientists are usually able to figure out rationalizations to justify their actions because the immediate goals of scientific research are usually to benefit humanity. Even those scientists who deliberately choose to work on things that are clearly destructive (the development of agents for biological and chemical warfare or more powerful bombs) usually can find some reason to square their consciences, by appealing to in-group/out-group thinking (“The enemies of my country/race/religion may also be developing these weapons and so I am doing this in self-defense and to protect humanity against a greater evil.”).

This was the kind of thinking of many scientists who worked on the Manhattan project during World War II that resulted in the creation of the atomic bomb. There is no reason to think that they were any more evil than anyone else. In fact, I have met and spoken with Hans Bethe, the Nobel Prize winning physicist (for his work explaining the energy production in stars) who was the leader of the theory group on the project and instrumental in guiding and shepherding the many brilliant scientists who worked under him. Bethe struck me as a kind and gentle man, who after the war worked for peace and disarmament. Einstein was the same. Although they both advocated for the development of a nuclear weapons program prior to the war, and their own groundbreaking research was the basis for nuclear weapons, they were also consistently a voice for peace. And there is reason to think that the scientists working on the opposing German nuclear weapons program, led by another Nobel Prize winner Werner Heisenberg, were doing so for the same reasons.

It is possible for society to make a judgment that science is too dangerous to continue and to shut it down almost completely. If governments refuse to provide funding for research and for agencies like the National Science Foundation and the National Institutes of Health, modern science as we know it would cease to exist, because we could no longer maintain a professional class of scientists. Of course, the spirit of scientific inquiry will remain and there will be amateur scientists whose thirst for knowledge will drive them along. But we have to remember that the emergence of the professional scientist, someone paid primarily to do research, is a relatively new invention. In England, such people only came into being around 1850, with Charles Darwin’s friends and colleagues Joseph Hooker and T. H. Huxley being two of the earliest. Darwin himself belonged to the earlier tradition of the amateur scientist who was independently wealthy enough to indulge what was essentially a hobby, or who had a job (clergyman or professor) that allowed him sufficient freedom and time to do so.

But in shutting down professional science, people would be aware that they would also be shutting themselves off from almost all the enormous benefits that science provides. And this is where the difference with religion arises. Although it can be argued that science and religion provide both benefits and evils, when it comes to science there is nothing else that we know that can be substituted to provide those benefits. We cannot keep the baby while throwing out the bathwater, so we make a Faustian bargain.

But in the case of religion, there is no benefit claimed by religion that cannot be provided by other institutions. The only real reason to continue supporting the idea of god is that it is true. Since there is no convincing empirical evidence at all in support of that proposition, religion becomes dispensable.

POST SCRIPT: Tortured logic

The Daily Show discusses the damage done to language by the ‘war on terror’.

The effect of education on religion

Voltaire was stinging in his criticisms of religion in general and Christianity in particular. He provided his own definition of a Christian as follows: “A good-natured, simple fellow; a true lamb of the fold, who, in the innocence of his heart, persuades himself that he firmly believes unbelievable things that his priests have told him to believe, especially those he cannot even imagine. Consequently, he is convinced that three x’s make fifteen, that God was made man, that he was hanged and rose to life again, that priests cannot lie, and that all who do not believe in priests will be damned without remission.”

Voltaire was being sarcastic when he made the statement that Christians are necessarily ‘good-natured’ because elsewhere he makes clear that he knows that religious people are capable of incredible evil. But he may have genuinely thought that one had to be simple (in the sense of naïve) to believe in god because he viewed the whole concept of god as requiring one to believe preposterous things. As he said: “The son of God is the same as the son of man; the son of man is the same as the son of God. God, the father, is the same as Christ, the son; Christ, the son, is the same as God, the father. This language may appear confused to unbelievers, but Christians will readily understand it.”

And to reiterate his view that to adopt religion involved the abandonment of reason, he said: “The truths of religion are never so well understood as by those who have lost the power of reasoning.” (Voltaire, Philosophical Dictionary (1764), taken from Jonathon Green, The Cassell Dictionary of Cynical Quotations.)

The authors of the current crop of atheist books have attacked religion head-on by showing how untenable the claims of religion are, and how antithetical to rational thought. I have argued before that there are no mitigating benefits for religion that cannot be obtained from other sources. Since people should have the right to believe anything they want, the practical question becomes: What is the best way of making the unappealing aspects of religion better known so that more people will voluntarily relinquish it?

Since the liberal intellectual tradition holds that education leads to critical thinking, the solution is thus seen to lie in more and better education, the idea being that this leads to more reasoning minds, which in turn will lead to greater skepticism towards beliefs that fail the tests of reason and evidence, and hence to the decline of religion.

But this may be too optimistic a view of the power of education. I am not so sanguine that education holds the key. I think Voltaire was wrong in his belief that only the unreasoning could believe in god. As I have repeatedly pointed out, smart people are quite capable of believing weird things and finding reasons to do so, provided the desire to do so is strong enough. So more education will not necessarily lead to less religion. In fact, a longitudinal study of 10,000 adolescents actually found the opposite effect: those who did not go on to college had greater declines in attendance at services and in the importance they assigned to religion, and higher rates of disaffiliation from religion.

This result was not a surprise to me, despite the widespread critiques by some people that universities are liberal hothouses, indoctrinating students away from ‘traditional’ conservative values such as religion. As a teacher of many years, I have found laughably naïve the notion that college teachers have such power over student beliefs.

It is true that students are likely to encounter faculty who are, in general, less religious than the general public. An interesting analysis of religious beliefs in academia finds that “academics in the natural and social sciences at elite research universities are significantly less religious than the general population. Almost 52 percent of scientists surveyed identified themselves as having no current religious affiliation compared with only 14 percent of the general population.”

But this may not be decisive. As I had said earlier, to some extent, the more education one has, the more one is able to find sophisticated reasons to hold on to whatever one wants to believe. As Michael Shermer says in his book Why People Believe Weird Things (2002, p. 283): “Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons.”

What more education (especially in college) does to student beliefs may depend on what the students’ prior inclinations are. For those who arrive already doubting, the discovery in college that they are not alone, that there exist like-minded students who share their views, and the general willingness of academia to treat doubt and skepticism as positive traits, could well speed them along the path to greater non-belief.

But for those who are determinedly faithful, college could provide them with better tools to defend their beliefs. Reason cannot easily overcome the will to believe. As Jonathan Swift nicely put it: “You cannot reason a person out of a position he did not reason himself into in the first place.”

So up to a point, more of traditional education actually aids belief, because much of it focuses on information and skills rather than deep learning. The point at which more education leads to disbelief is when people start really looking closely at evidence for beliefs, start trying to integrate different areas of knowledge into a coherent worldview, and begin to get in the habit of making reasoned judgments using incomplete knowledge. This becomes more likely to occur when students do more research-like activities because then the ability to form and defend judgments based on data and evidence and reason becomes paramount.

We see this in the fact that members of the National Academies of Science have far higher rates of disbelief in god than other scientists or the general public. “In a poll taken in 1998, only 7 percent of the members of the US National Academy of Sciences, the elite of American scientists, said they believed in a personal God.” (Victor Stenger, God: The Failed Hypothesis, p. 10.)

Charles Darwin is a good example of both aspects of this phenomenon. We know that he was religious in his youth and obtained a degree from Cambridge University in the sciences. At the time of his education, the prevailing view of life was special creation, the idea that god specifically created species to make them fit into particular ecological niches. Nothing he learned at university dissuaded him from this belief; in fact he was strengthened in it, and was considering a life as a clergyman. In his autobiography, he discusses how, on his round-the-world voyage on the Beagle, the plants and animals and insects he saw in South America seemed to challenge the view of special creation. In order to deal with this, he said he started inventing increasingly complex reasons to sustain his belief in special creation. He took this to such an extent that the sailors on the boat, although far less educated than he was, found his explanations highly amusing. In other words, those much less educated than he could see the problems with the theory of special creation that he could not, because not only did he not want to see them, he had the tools to explain them away.

But as he became more and more absorbed in his studies, went deeper and more global in his thinking, and tried to create an integrated theory to explain his findings, his religious beliefs just could not be sustained and he abandoned them completely, ceasing to believe not just in special creation, but in god as well.

The long-term solution to religion may not be more education but creating a climate where doubt and skepticism are more prevalent and acceptable. It is only then that education has something to work with. The current crop of high-profile books arguing against religion is creating just such a climate and is thus to be welcomed.

POST SCRIPT: Storms in tea cups

Lewis Black lets loose his frustration with political grandstanding over non-issues.

Why can’t science and religion get along?

Many of the recent attacks on religion have come from those with a scientific background. But there are many atheist scientists (such as the late Stephen Jay Gould) who have not wanted to criticize religion the way the current crop of atheists are doing. They have tried to find a way for science and religion to coexist by carving out separate spheres for religion and science, saying that science deals with the material world while religion deals with the spiritual world and that the two worlds do not overlap. Gould even wrote an entire book Rocks of Ages: Science and Religion in the Fullness of Life based on that premise.
This is not a new argument. Such appeals from high-profile individuals tend to recur whenever there is a science-religion flare-up, such as during the evolution controversy leading up to the 1925 Scopes trial concerning the teaching of evolution in schools. Edward J. Larson in his book Summer for the Gods (1997) writes (pp. 121-122):

When the antievolution movement first began in 1923, [James] Vance [pastor of the nation’s largest southern Presbyterian church] and forty other prominent Americans including [Princeton biologist Edwin G.] Conklin, [American Museum of Natural History president Henry Fairfield] Osborn, 1923 [Physics] Nobel Laureate Robert Millikan, and Herbert Hoover, tried to calm the waters with a joint statement that assigned science and religion to separate spheres of human understanding. This widely publicized document describes the two activities as “distinct” rather than “antagonistic domains of thought,” the former dealing with “the facts, laws and processes of nature” while the latter addressed “the consciences, ideals and the aspirations of mankind.”

This argument, that the existence of god is something about which science can say nothing so scientists should say nothing, keeps appearing in one form or another at various times but simply does not make sense. Science has always had a lot to say about god, even if not mentioning god by name. For example, science has ruled out a god who created the world just 6,000 years ago. Science has ruled out a god who had to periodically intervene to maintain the stability of the solar system. Science has ruled out a god whose intervention is necessary to create new species. The only kind of god about which science can say nothing is a god who does nothing at all.

As Richard Dawkins writes (When Religion Steps on Science’s Turf, Free Inquiry, vol. 18 no. 2, 1998 (pp. 18-9), quoted in Has Science Found God?, Victor J Stenger, 2001):

More generally it is completely unrealistic to claim, as Gould and many others do, that religion keeps itself away from science’s turf, restricting itself to morals and values. A universe with a supernatural presence would be a fundamentally and qualitatively different kind of universe from one without. The difference is, inescapably, a scientific difference. Religions make existence claims, and this means scientific claims.
There is something dishonestly self-serving in the tactic of claiming that all religious beliefs are outside the domain of science. On the one hand, miracle stories and the promise of life after death are used to impress simple people, win converts, and swell congregations. It is precisely their scientific power that gives these stories their popular appeal. But at the same time it is considered below the belt to subject the same stories to the ordinary rigors of scientific criticism: these are religious matters and therefore outside the domain of science. But you cannot have it both ways. At least, religious theorists and apologists should not be allowed to get away with having it both ways. Unfortunately all too many of us, including nonreligious people, are unaccountably ready to let them. (my italics)

Victor Stenger in his book God: The Failed Hypothesis (p. 15) points out that the idea that science and religion occupy separate spheres is also in contradiction to actual practice: “[A] number of proposed supernatural or nonmaterial processes are empirically testable using standard scientific methods. Furthermore, such research is being carried out by reputable scientists associated with reputable institutions and published in reputable scientific journals. So the public statements by some scientists and their national organizations that science has nothing to do with the supernatural are belied by the facts.”

Dawkins and Stenger make a strong case. So why are some scientists supportive of such a weak argument as that science and religion occupy distinct and non-overlapping domains? Stenger (p. 10) suggests a reason:

Nevertheless, most scientists seem to prefer as a practical matter that science should stay clear of religious issues. Perhaps this is a good strategy for those who wish to avoid conflicts between science and religion, which might lead to less public acceptance of science, not to mention that most dreaded of all consequences – lower funding. However, religions make factual claims that have no special immunity from being examined under the cold light of reason and objective observation.

Is that it? Are scientists scared of criticizing religion for fear of upsetting the gravy train that funds their research? That is a somewhat cynical view but not one that can be dismissed easily.

Another possible reason may be (as I argue in my book Quest for Truth) that scientists are simply sick of arguing about whether science is compatible with religion, find it a time-wasting distraction from their research, and use this ploy as a rhetorical escape hatch to avoid the topic whenever it arises.

Yet another reason may be that scientists do not generally know (or even care) what other scientists’ religious views are. A scientist’s credibility depends only on the quality of the science that person does, and all that is required for good science is a commitment to methodological naturalism within the boundaries of one’s area of research. A scientist’s attitude towards philosophical naturalism is rarely an issue. Because of this lack of relevance of the existence of god to the actual work of science, scientists might want to avoid altogether the topic of the existence of god simply to avoid creating friction amongst their scientific colleagues. As I said before, the science community has both religious and non-religious people within it, so why ruffle feelings by bringing up this topic?

But while I think that it is a good idea to keep religion out of scientific discussions since god is irrelevant when one is interpreting experimental results or comparing theories, there is no reason why scientists should not speak out against religion in public life. If we think that religion is based on a falsehood, and that the net effect of religion in the world is negative, we actually have a duty to actively work for its eradication.

I think that Baron D’Holbach (1723-1789) gave the best reason for campaigning against religion when he explained why he did so:

Many men without morals have attacked religion because it was contrary to their inclinations. Many wise men have despised it because it seemed to them ridiculous. Many persons have regarded it with indifference, because they have never felt its true disadvantages. But it is as a citizen that I attack it, because it seems to me harmful to the happiness of the state, hostile to the march of the mind of man, and contrary to sound morality, from which the interests of state policy can never be separated.

I agree with the Baron.

Next: Is more education the answer?

POST SCRIPT: Rationality and religion

“Rational arguments don’t usually work on religious people. Otherwise there would be no religious people.” From another great little video clip from the TV show House.

Does religion play a uniquely useful role?

The recent appearance of best-selling books by atheists strongly criticizing religion has given rise to this secondary debate (reflected in this blog and the comments) as to what attitude atheists should take towards religion. Some critics of these authors (including fellow atheists) have taken them to task for being too harsh on religion and thus possibly alienating those religious “moderates” who might be potential allies in the cause of countering religious “extremism”. They argue that such an approach is unlikely to win over people to their cause. Why not, such critics ask, distinguish between “good” and “bad” religion, supporting those who advocate good religion (i.e., those parts of religion that encourage good works and peace and justice) and joining with them to marginalize those who advocate “bad” religion (i.e., those who use religion divisively, to murderous ends, to fight against social justice, or to create and impose a religion-based political agenda on everyone)?

It is a good question deserving of a thoughtful answer, which you are unlikely to find here. But I’ll give it my best shot anyway.
[Read more…]

Atheist/theist or naturalist/religious?

If one tries to categorize people by their beliefs about god, then there are many categories into which people fall (all definitions in quotes are from the Merriam-Webster online dictionary): the religious believe in the existence of a god who can and does intervene in the events of the universe; the deist is part of “a movement or system of thought advocating natural religion, emphasizing morality, and in the 18th century denying the interference of the Creator with the laws of the universe”; the pantheist “equates God with the forces and laws of the universe”; the agnostic is “one who is not committed to believing in either the existence or the nonexistence of God or a god”; and the atheist is one “who believes that there is no deity.”

Most philosophical discussions about religion (and opinion polls that try to measure the prevalence of religious beliefs) tend to divide people along the theist-nontheist fault line, where a theist is one who believes “in the existence of one God viewed as the creative source of the human race and the world who transcends yet is immanent in the world” and nontheist consists of everyone else. If you divide people up this way, then the groups that I have labeled as religious, deists, and pantheists all end up on the theist side of the split; atheists fall on the nontheist side; and agnostics straddle the divide.

While the theist-nontheist divide can lead to interesting discussions about important philosophical points, as a practical matter it usually goes nowhere, because there is no operational way of distinguishing between the deist, pantheist, agnostic, and atheist points of view.

The problem with the theist-nontheist division is that it depends on self-identification and all kinds of highly variable subjective factors come into play in deciding what label one assigns to oneself or to others. For example, almost all atheists will readily concede that the non-existence of god, like the non-existence of fairies and unicorns, cannot be proven and that therefore there is always the logical possibility that god exists. As a result, some atheists will prefer to describe themselves as agnostics, since the term atheist erroneously, but popularly, connotes the idea that such people know for certain that god does not exist.

Take the case of Charles Darwin. By the time he reached the age of forty, he was to all intents and purposes an atheist. The shift from belief to disbelief had been steady and inexorable. The more he learned about the laws of nature, the less credibility miracles and the doctrines of Christianity had for him. He considered the idea of a personal, benevolent, omnipotent god so illogical that he said it “revolts our understanding.” Darwin wrote that he:

“gradually came to disbelieve in Christianity as divine revelation.” There was no smugness and no hastiness to his loss of faith; it happened almost against his will. “Thus disbelief crept over me at a very slow rate, but was at last complete.” (The Reluctant Mr. Darwin, David Quammen, p. 245, my italics)

After he lost his faith, he had no doubts or anxiety about it but when pressed to give a label to his religious views he said, “The mystery of the beginning of all things is insoluble by us; and I for one must be content to remain Agnostic.” Darwin apparently shied away from the label “atheist” as being too aggressively confident, which went against his own cautious and non-confrontational personality, and he took refuge in the new word agnostic, which had been coined by his friend and colleague T. H. Huxley to meet the philosophical needs of just such people. I think many people who are functionally atheists share Darwin’s unease with calling themselves that. When I tell people that I am an atheist, for example, they often try to persuade me that I must “really” be an agnostic since I readily concede that I cannot be sure that there is no god.

Similarly, many people probably choose to call themselves deists or pantheists, not because they have any evidence for the existence of the deity, but because they seem to feel that if there is no god at all then there is no meaning to life. Since they desire their life to have meaning, the idea of there being a god is comforting and appealing to them and they seek to find some way to hold on to it, despite the lack of evidence. Deism and pantheism offer such an option without also requiring one to accept the absurdities that formal religions demand, like infallible texts and miracles. They allow those who need such an external source of meaning to have a god that gives their life meaning, while not compromising their belief that the world behaves in accordance with natural laws.

I feel that a more operationally useful classification scheme would be to sort people according to their answer to the question “Do you think that god in any way intervenes in the course of events contrary to natural laws?” In other words, we should ask what their views are on what people normally consider miracles. Those answering in the affirmative would be classified as religious, and those answering in the negative (atheists, deists, and pantheists) would be grouped under the umbrella term naturalist, with the name being selected because all these people see the world operating solely under the influence of natural laws. Most agnostics, other than those who are doggedly determined to not commit themselves, should also be able to answer this question definitively and decide which of the two groups they feel most closely fits them.

(If agnostics are still not sure how to answer, a more concrete version of the question might be to ask them: “If the person whom you respect and trust the most and know to be very religious said that god had spoken to him or her and had wanted a message conveyed to you to give away all your money and possessions to charity, would you do it?” If you do not think this could have happened and refuse the command with no hesitation, then you are operationally a naturalist. If you say you would do it or are not sure what you would do, then you are effectively a religious person. I suspect that most agnostics will fall into the naturalist camp, since agnostics do not usually expect god to actually do anything concrete.)

This kind of naturalist-religious divide provides a more useful classification scheme since it is based on whether there is any observable difference in the behavior of people as a consequence of their beliefs, rather than on their beliefs themselves. The members of the naturalist group (the atheist, the deist, the pantheist, and most agnostics) all live their lives on the assumption that god does not intervene in life in any way. None of them pray or ask for god to intervene. (Those who did pray would be switched into the religious group because then they effectively believe in an interventionist god.) As I have said many times before, what people say they believe is of little consequence except insofar as it influences their actions.

So the religious-naturalist divide based on the answer to the question “Do you think that god in any way changes the course of events contrary to natural laws?” is, to my mind, a much better measure of the level of belief in god than the theist-atheist divide. It would be nice to see polls conducted on this question. My suspicion is that there are far more naturalists (i.e. functional unbelievers) than one might suspect.

Next: Should atheists seek to undermine religion or does religion play a valuable role that makes it worth preserving?

POST SCRIPT: Sputnik trivia

Last week saw the fiftieth anniversary of the launching into space by the Soviet Union of the satellite Sputnik, an event that galvanized the US space program. In 1999, a nice film called October Sky was released based on the true story of a group of high school students in a small mining town in West Virginia who were inspired by Sputnik to build rockets on their own. The film was based on the memoir Rocket Boys by one of the boys, Homer Hickam, who later did become a rocket scientist for NASA.

A curious feature of this story is that the name October Sky is an anagram of the name Rocket Boys.