Charlatans of the paranormal

The magician James Randi (whose stage name is ‘The Amazing Randi’) is quite a remarkable person. In addition to his day job as a professional magician, he has a secondary career debunking those he sees as charlatans: people who use ordinary magic trickery to enrich themselves by fooling gullible audiences into thinking that they have supernatural powers.

I saw Randi in person when I was in graduate school, where he gave a performance of his magic to the student body and then a colloquium in the physics department. In each case, he first did various impressive tricks, such as bending spoons and changing the time on people’s watches seemingly without touching them, and escaping after being chained and put into a sack. He ended with a talk warning everyone that what he did was due to pure sleight of hand and deception, and that anyone who claimed to be using powers such as telekinesis, spiritual energy, and the like to do such things was simply lying.

At that time, one of Randi’s targets was Uri Geller, who claimed that he had paranormal powers that enabled him to bend spoons without touching them, to see what was inside sealed envelopes, to identify which of several closed identical containers had water inside, and so on. Geller had made quite a name for himself and was invited in 1973 to show his prowess on NBC’s The Tonight Show, then hosted by Johnny Carson. But Carson was no fool. He had started his own entertainment career at age 14 as a magician called “The Great Carsoni” and was well aware of the possibility of trickery. So Carson hired Randi as a consultant for the show, and Randi advised him on what precautions to take so that if Geller succeeded in doing what he claimed he could do using paranormal powers, it could not be attributed to simple trickery. You can see the clip of Geller’s appearance here. Thanks to Randi’s advice and Carson’s vigilance, Geller’s performance was a total bust. He could not do anything and ended up pleading that he was ‘feeling weak’ that day. He disappeared in disgrace for a while but seems to be coming back again, hoping that people have forgotten that fiasco.

Another Randi/Carson exposé, in 1987, was that of the preacher Peter Popoff, who bilked gullible and poor religious believers out of millions of dollars by claiming that god spoke to him and told him things about them that enabled him to heal them. It turned out that the voice he heard was not that of god but that of his wife, speaking into a receiver hidden in his ear and telling him things she had learned about the people Popoff was supposedly healing. After being exposed on Carson’s show, Popoff too lay low for a while, but he has recently returned with the same swindle, preying on the gullible.

In his long-term quest to show that people’s claims of having paranormal powers are a fraud, Randi has set up an educational foundation, and an anonymous donor has offered a $1 million reward to anyone who can pass a test: identifying the genders of the authors of 20 diaries by touching the covers, and getting at least 16 right. In 10 years, no one has succeeded; the best result so far has been 12 right. Some prominent psychics have stayed away, and one can understand why. Their fame and fortune depend on gullible people believing in them, and they are unlikely to risk being exposed as frauds.

But suppose someone did come along who got 16 right? Would that prove that they had paranormal powers? No. Since there is a 50-50 chance of guessing right for each diary, the probability of getting at least 16 out of 20 right is about 0.006, or 6 in 1,000, or roughly one chance in 167. This is unlikely but not that rare. To convince a skeptic like me to believe in the paranormal would require evidence that approaches certainty. To provide convincing proof of the paranormal, a person claiming to have such powers should be able to get everything right, and be able to do it at any time.
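
For anyone who wants to check that arithmetic, here is a minimal sketch (in Python, purely illustrative) of the binomial calculation: with 20 independent 50-50 guesses, the chance of getting at least 16 right is the sum of the binomial probabilities for exactly 16, 17, 18, 19, and 20 correct.

    # Chance of getting at least 16 of 20 diaries right by pure 50-50 guessing
    from math import comb

    def prob_at_least(k, n=20, p=0.5):
        # Binomial tail probability P(X >= k) for n independent guesses,
        # each succeeding with probability p
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

    print(prob_at_least(16))  # about 0.0059, i.e. roughly 6 in 1,000, or one chance in about 170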

This question of repeatability of such proofs is important. It is quite possible to have even an extremely unlikely event occur by chance and that would prove nothing. It is possible to get hit by lightning even if Zeus is not deliberately aiming thunderbolts at you.

What is interesting is that psychics around the world keep claiming to have supernatural powers but can never produce them under scrutiny. In Sri Lanka, we had our own rationalist champion named Abraham Kovoor, who in his day also offered a monetary reward to the many ‘god-men’ in the region (people who claimed that they had supernatural powers because they were an incarnation of god) if they could read the serial number of a currency note in a sealed envelope. Kovoor went to his death at a ripe old age with his money safe.

Because of the lack of any confirmed positive evidence, I think that the logical and rational thing to do is to assume that every kind of paranormal phenomenon that has been postulated simply does not exist, just as an afterlife does not exist.

Of course, true believers in the existence of the supernatural will find all kinds of excuses for the absence of any evidence for it. For example, a common demonstration used by the ‘god-men’ in India and Sri Lanka to convince their devotees that they had supernatural powers was to wave their hands and produce, seemingly out of thin air, ‘holy ash’, the kind that devotees rub across their foreheads, similar to what some Christians do on Ash Wednesday. I have friends who believe in one of these ‘god-men’ (famous in South Asia) called Sai Baba, and they tell me stories like this to persuade me that their belief is rational. (See this site for exposés of Sai Baba.)

I recall a time when Kovoor staged a public demonstration where he did the very same thing that the ‘god men’ claimed they could do using divine powers. But true believers were unfazed. One person wrote to the newspaper that Kovoor’s display did not prove anything at all because while the ‘god-men’ produce holy ash, Kovoor had only succeeded in producing ordinary ash!

As the old saying goes, there are none so blind as those who will not see.

POST SCRIPT: How the ten commandments came about

Sometime ago, I wrote that the ten commandments looked like something cobbled together by a frustrated committee struggling to come up with a round number of items. It turns out that this is exactly how they came about, with the committee consisting of god, his personal assistant Larry, and Jesus.

For all the Mr. Deity clips, see here.

The war propaganda machine grinds on . . .

And so year five begins . . .

Today marks the beginning of the fifth year of the endless war of death and destruction that is destroying Iraq and its people. It is an appropriate time to focus attention on all those responsible for this atrocity, starting with the entire Bush administration, the neoconservative clique that surrounds the administration, the war cheerleaders in the so-called ‘think tanks’ like the American Enterprise Institute, and the pundits in the media who provided the intellectual cover for them. Robert Parry looks at how “the four-year-old conflict resulted from a systemic failure in Washington – from the White House, to congressional Republicans and Democrats, to an insular national news media, to Inside-the-Beltway think tanks.”

Proof of the afterlife

Recently a friend of mine posed an interesting question. She said that none of us really know for sure if there is life after death or not, although all of us have our own beliefs. She wondered how differently we would live our lives if we could have conclusive proof either way. This led to an interesting discussion about what would constitute proof in such situations.

The Power of Film

Films can have an enormous emotional impact on viewers, swaying them in ways that their intellect would oppose. I was reminded of this recently when I watched two films from the silent era, Buster Keaton’s The General (1927) and D. W. Griffith’s Birth of a Nation (1915). The latter was one of the earliest American feature films (the first American feature was made in 1912, and the very first feature film anywhere was made in Australia in 1906).

It was purely a coincidence that I happened to watch two films from the silent era so close to each other, because my reasons for watching them were quite different. I had always wanted to see a Buster Keaton film because I had read that he was a pioneering genius of the silent film comedy genre. I watched Griffith’s film as part of the College Scholars Program that I help teach.

Coincidentally, both films involved the Civil War and were told from a viewpoint sympathetic to the Confederacy. The first thing that struck me about both was how modern they were in the way they told their stories. They did have obvious signs of age, such as the lack of sound and color and special effects, and the poor quality of the film stock. But apart from these purely physical factors, the narrative structure was surprisingly familiar, with flashbacks being about the only modern storytelling device that was missing.

Because of the lack of spoken dialogue, the actors had to exaggerate their gestures a little in order for the viewer to get a sense of their emotions and what they were saying, but apart from that, both films kept the viewer engrossed in their respective stories. Despite the absence of spoken words (or perhaps because of it?), they were fast-moving and engaging.

But there the similarities ended.

The General is a comedy in which the two warring sides are just a backdrop for a simple story of a train engineer (Buster Keaton) whose girlfriend and train (named The General) are captured by the other side. The entire film deals with the engineer’s foray into enemy territory to get them both back home.

This is not a political film. The entire film could have been done with the two sides interchanged and all that would have been necessary would have been to switch the army uniforms. The fact that it was the Civil War was also immaterial. Any two warring factions would have served equally well. In fact there was not a single black person in the whole film (at least as I recall). The fact that the engineer and girl were from the South was seemingly due to the idea for the film coming from an actual incident in the war. This film is worth watching, if only to see how well Keaton did all the stunts himself.

Birth of a Nation, on the other hand, is a very political film, determined to drive home a very specific message. I had heard of the film before and the comments were of two kinds: (1) that it was a landmark in the development of modern film; and (2) that it was terribly racist. After seeing the film, I have to agree with both judgments.

The film (which runs a little over three hours, surprisingly long for that period) consists really of two parts. The first part starts just prior to the Civil War and deals with events leading up to its end and Lincoln’s assassination. The second part deals with the period of Reconstruction in the south immediately afterwards.

The first part starts with an idyllic portrayal of life before the Civil War, with the stories of two large happy white families – one from the north, the other from the south – who are friends and visit one another, and the budding romances of a son and daughter from one family with a son and daughter from the other. The war then pits the boys against each other in battle and produces deaths in each family.

This first part of the film is not too offensive and if the film had ended at this point there would not have been much controversy. The chief criticism that would have been leveled at it would have been the portrayal of all blacks as ‘happy slaves,’ either cheerfully loyal to their masters as house servants or happily working in the cotton fields and waving to the masters as they walk by. Lincoln is portrayed as a good man who did not want to seek vengeance in the South after the North’s victory.

But the second part is set entirely in the south and deals with the Reconstruction following Lincoln’s death. This is where the film’s highly disturbing treatment of race becomes manifest. This period is portrayed as a time when blacks took complete control of life in the South, shutting out white voters in elections and thus getting majorities in the legislatures. The southern whites are portrayed as a horribly oppressed people, being pushed aside by blacks in the streets and suffering various other indignities. The blacks are entirely caricatured, with white actors in blackface portraying them as lazy and drunken and evil, shuffle-dancing in the streets, lecherously leering at the demure white women, and always rubbing it in to the whites that they were now the bosses. Only the faithful house slaves stayed loyal to the whites, to the extent of rescuing them from black mobs at great peril to themselves.

The first part of the film, by showing scenes of these two loving, courteous families, with children playing and puppies and kittens frolicking, suffering the tragedies of their family members being killed in the war, etc., had already created sympathy for them in the minds of the viewer. The only black people who emerged as recognizable characters appeared in the second part, and were two-dimensional portrayals of evil so that the viewer had no sympathy for them at all.

But the real shocker is that the film portrays the creation of the Ku Klux Klan during Reconstruction as a response by these decent, law-abiding whites to the lawlessness created by black rule. It is started by one of the family members we have already identified with, who is appalled by the breakdown of order and merely seeks to right wrongs. The KKK’s reign of terror is also not portrayed. Only one black person is shown being ‘tried’ and found guilty by the KKK, his body later dumped at the home of the evil black leader. Instead, the members of the KKK are portrayed like comic-book heroes, ‘respectable’ citizens who adopt secret identities to fight crime and injustice. Only in this case the costumes that hide their identities are the notorious white sheets.

There is no surer way of gaining an audience’s sympathy than setting up a scene in which a plucky little band of good people (including the elderly, women, children, and pets) heroically fight overwhelming odds against an evil and faceless enemy. This is a time-tested method of swaying the viewer’s sympathies and is a staple of cowboy films. Griffith heavily exploits this towards the end of Birth of a Nation. So powerfully had the deck been emotionally stacked in favor of the white families that in the climactic scene, when the tiny group of white people is trapped in a small house and surrounded by a large number of advancing hostile black Union soldiers, I found myself rooting for their rescue, even though the rescue was going to be by the KKK.

The spell cast by Griffith was broken whenever the scene cut to show the KKK riding in to save this group, because the sight of people covered in white sheets now has an overwhelmingly negative emotional impact. But one can imagine how in 1915, just fifty years after the Civil War ended, this film could be seen as a huge propaganda coup for the KKK, showing them in an entirely positive light. Although the KKK had been dormant for some time, 1915 saw the second resurgence of the group, and the timing of that had to have had something to do with the release of this film.

The fact that Griffith was able to portray a group like the KKK in such a sympathetic light is a warning about the dangerous power that films can have in shaping attitudes and sympathies. It illustrates the importance of having people realize that films and other forms of video can never be taken as the only source of knowledge. We cannot avoid the hard work of reading about and around important events, both historical and contemporary, if we are to piece together a reasonably accurate understanding of events.

POST SCRIPT: Mr. Deity returns

Mr. Deity is taken on a tour of hell by Lucifer.

For all the Mr. Deity clips, see here.

How the media patronizes us

The presidential election campaign for 2008 has already started, with a whole host of declared and undeclared candidates running. George Bush’s performance seems to have persuaded people that anyone can do a better job than him.

On the Democratic side, we have Joe Biden, Hillary Clinton, Christopher Dodd, John Edwards, Mike Gravel, Dennis Kucinich, Barack Obama, and Bill Richardson. (Tom Vilsack has already dropped out.)

On the Republican side, there are Rudy Giuliani, Duncan Hunter, John McCain, Ron Paul, Mitt Romney, and Tom Tancredo, and possibly Newt Gingrich, Chuck Hagel, and Fred Thompson.

All the candidates face stiff hurdles in getting their respective nominations. But the reality is that almost all of them have no chance. It is not because they are not good candidates, or are incapable of being president, or have unsavory histories, but because two inter-related issues work against them right from the start.

One of those issues is the ability to raise money. It requires a lot of money to run a presidential campaign. This is something that everyone is aware of. The less obvious but related issue is that the media has already made a judgment about who is ‘worthy’ and capable of being president and some of the candidates have already been written off. The coverage of their campaigns will reflect this bias against them and this will adversely affect those candidates’ ability to raise money and gain name recognition.

It is clear that the media has already chosen the following as the ‘viable’ candidates based on nothing more than their own preferences. For Democrats they are Clinton, Obama, and Edwards. For the Republicans, they are Giuliani, McCain, and Romney.

The media will either be dismissive of the others, or treat them as distractions, or use them as fodder to provide ‘color’ to the campaigns. For example, Michael McIntyre, in his ‘Tipoff’ column in the Plain Dealer on January 20, 2007, described Kucinich’s campaign as ‘futile.’ On what basis? He does not say. The fact is that Kucinich and Paul are the only Congresspeople running for president who had the foresight to vote against the Authorization for Use of Military Force Against Iraq Resolution of 2002, the disastrous law that George Bush used to wage his illegal and immoral invasion of Iraq. But that seems to count for nothing in the minds of the media, who continue to give prominence to the politicians and pundits who have been consistently wrong on everything concerning this war. (Obama also opposed the war but was not in Congress at that time.)

This is not a new phenomenon. The pack of media journalists that follows campaigns as a group has long tended to decide early on which candidates ‘deserve’ serious consideration, or even are worthy of being president, and to slant its coverage accordingly. Jonathan Schwarz describes an experience he had many years ago that illustrated to him that “the government and corporate media self-consciously see themselves as a governing elite that runs things hand in hand.”

Washington Post columnist Richard Cohen came to talk at Yale in 1988, just after I arrived. Following schmancy Yale tradition, he had tea with a small group of students and then ate dinner with an even smaller group. I weaseled my way into attending.

Gary Hart had recently flamed out in the ’88 presidential race because of Donna Rice. And at dinner Cohen told all us fresh-faced, ambitious, grotty youths this:

The Washington press corps had specifically tried to push Hart out of the race. It wasn’t because Hart had had extramarital affairs—everyone knew this was the norm rather than the exception among politicians. So Hart wasn’t at all unusual in this respect. Instead, Cohen said, it was because the press corps felt that Hart was “weird” and “flaky” and shouldn’t be president. And when the Donna Rice stuff happened, they saw their opening and went after him.

(I wish I remembered more about what Cohen said about the specific gripe of the press corps with Hart, but I don’t think he revealed many details.)

At the time, I remember thinking this:

1. How interesting that the DC press corps knows grimy details about lots of politicians but only chooses to tell the great unwashed when they decide it’s appropriate.

2. How interesting that the DC press corps feels it’s their place to make decisions for the rest of America; ie, rather than laying out the evidence that Hart was weird, flaky, etc., and letting Americans decide whether they cared, they decided run-of-the-mill citizens couldn’t be trusted to make the correct evaluation.

3. How interesting that Cohen felt it was appropriate to tell all this to a small group of fresh-faced, ambitious, grotty Yale youths, but not to the outside world. And how interesting that we were being socialized into thinking this was normal.
. . .
If you’re not part of their little charmed circle, believe me, all your worst suspicions about them are true. They do think you’re stupid. They do lie to you. They do hate and fear you. Most importantly, they think you can’t be trusted with the things they know—because if you did know them, you’d go nuts and break America.

CBS News’s Dick Meyer confirms the fact that the media often decides to not tell the public the truth about political leaders:

This is a story I should have written 12 years ago when the “Contract with America” Republicans captured the House in 1994. I apologize.

Really, it’s just a simple thesis: The men who ran the Republican Party in the House of Representatives for the past 12 years were a group of weirdos. Together, they comprised one of the oddest legislative power cliques in our history. And for 12 years, the media didn’t call a duck a duck, because that’s not something we’re supposed to do.

The situation now is not unlike that which existed earlier when Thomas Jefferson said:

Men by their constitutions are naturally divided into two parties: 1. Those who fear and distrust the people, and wish to draw all powers from them into the hands of the higher classes. 2. Those who identify themselves with the people, have confidence in them, cherish and consider them as the most honest and safe, although not the most wise depository of the public interests.

It seems clear to me that the members of the mainstream media and the political classes today tend to fall into the first group. But for a healthy democracy, it is important that we place ourselves in the second group. This is why I think that citizenship means not simply accepting what is given to us by the media but being active seekers of knowledge.

Life is coarse grained, research is fine grained

In a celebrated remark in the case Jacobellis v. Ohio (1964), involving “hard core pornography”, US Supreme Court Justice Potter Stewart said: “I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description; and perhaps I could never succeed in intelligibly doing so. But I know it when I see it.”

This is a common problem that we all face. There are things that we “know” in a general sense but which we cannot strictly define. Pornography is just one of an infinite class of problems for which we have broad-brush definitions (i.e., we think we know it when we see it) that almost always break down under close examination; exceptions to the definitions we create can always be found.

I am becoming convinced that this is a general feature of life. Questions have simple answers only when we don’t examine them too closely. Suppose, for example, I asked the question “What is the length of my desk?” You would expect that there is a definite length to it and that there should be a straightforward way to get the answer.

At the simplest level, you could take a ruler, measure the desk, and call that its length. But is that the most accurate measurement? A ruler is, after all, a pretty coarse measuring instrument. You could get fancier and use more sophisticated devices such as laser beams and high precision timers to get increasing levels of precision. But at some point you reach a limit, because at a fundamental level, due to Heisenberg’s Uncertainty Principle, the length of the desk is not a well-defined quantity. Although the desk looks like an object with sharp boundaries, at the sub-atomic level the atoms on the surface are quantum mechanical systems, so the edges of the desk are not sharply defined but are instead fuzzy, a blur. How do you measure the length of a blur?

At the large scales with which we normally work, we can ignore this and think of the desk as having a definite length but that is because we are not looking too closely.

For another example, although we all have a general intuitive idea about what is science and what is non-science, I have previously discussed how, when you look closely at the question, it is hard to strictly demarcate science from non-science. This is because the problem of finding necessary and sufficient conditions that demarcate one class of objects from another is very hard, and perhaps impossible.

While in everyday life we tend to be coarse-grained in our outlook, universities tend to be places where things are examined in fine-grained detail. This is partly the reason why universities have received the label of “ivory towers.” To those outside the university, it can seem like academics are engaged in research at a level of detail that seems pointless, and the ‘ivory tower’ label is sometimes intended as an insult. But the reality is that universities are one of the few places where people try to examine things closely, to see how far we can go in defining things before we reach the limits at which they break down. While to outsiders this may seem like nitpicking, such fine-grained analyses can have important practical consequences.

For example, most people have a clear idea of (say) what is alive and what is dead, of what is human and what is not. But those classifications are not as clear-cut as they can seem. What is considered dead for example, has changed over time, from ‘heart dead’ to ‘brain dead’ to ‘persistent vegetative state.’ Knowing the precise limits of knowledge in this area has important practical consequences. (See part 1, part 2, and part 3 of that series.)

It seems to be the case that as much as we might like to have certainty, we can never have it. At some point, we reach a level of detail where we have to make a decision, a judgment, as to what something is and what we need to do. This is why we often delegate to people who have studied these things (judges, doctors, academics, and other experts in each field) the right to make such judgments on our behalf. It is not that they are infallible and cannot be wrong, but that at least they work with an awareness of the limits of knowledge and of the ambiguities that exist at the fine-grained level.

The undogmatic dogmatism of scientists

In a recent online discussion about whether intelligent design creationism should be taught as part of science, one of the participants took exception to a statement by someone else that the theory of evolution is so well established that there is no point in allowing for the inclusion of intelligent design creationism. The challenger asked, quite reasonably: “On what things is there no room for debate? Of what things are we so certain that we’re willing to close the door to possibilities? If academics allow themselves to appear dogmatic about their theories, we legitimize dogmatism. We should be careful that scientists themselves do not become the new proselytizers to claim they hold absolute truth.”

This puzzlement is not uncommon and not unjustified. Seen from the outside, we scientists must seem either unable to make up our minds as to what we know for certain and what we are unsure of, or guilty of cynically shifting our position for polemical advantage, sometimes arguing that evolution is a fact beyond dispute (in order to exclude intelligent design creationism as a viable competitor) while also asserting that intelligent design creationism is not scientific because it is not falsifiable. On the surface, those two positions seem inconsistent, applying different criteria to the two theories.
It is true that scientists assert that “evolution is a fact,” just as they assert that “gravity is a fact.” They also acknowledge the “theory” of evolution and the “theory” of gravity. And they also assert that ALL knowledge is provisional and subject to change.

How can all these things be simultaneously true? How can something be at the same time a fact and a theory, certain and yet subject to change? These are deep questions and ones that can lead to heated discussions since they affect deeply held core beliefs about science and religion.

These also happen to be questions that form the core of the seminar course I teach to sophomores. We discuss all kinds of things in my course including science and religion, intelligent design etc. and it is remarkable that in the four years that I have taught it, there have been absolutely no blowups or confrontations or unpleasantness, although colleagues have told me that these very same questions have caused problems in their classes. The relative harmony of my class exists despite the fact that I know that many of my students are quite religious, from a variety of traditions, and they know that I am an atheist. These personal beliefs are not things that we keep secret because they shed important perspectives on the discussions.

Perhaps the reason for the lack of friction is that my course starts by looking closely at the knowledge structure of science. We read Pierre Duhem, Karl Popper, Thomas Kuhn, Imre Lakatos, Larry Laudan, and other historians and philosophers of science, and see how it is that science, unlike other areas of knowledge, progresses rapidly because of the commitment of its practitioners to a paradigm, a well-defined framework within which problems are posed and solved. The paradigm consists of a scientific consensus about which theory (or set of closely related theories) should be used for analyzing a problem, rules for determining what kinds of research problems are appropriate, the kinds of evidence, arguments, and reasoning that are valid, and the conditions that solutions to these research problems must satisfy to be deemed satisfactory. That complex paradigmatic framework is sometimes loosely and collectively referred to as a “theory”, and students quickly realize that the popular meaning of the word “theory” as some sort of simple hypothesis or guess does not apply in the scientific realm.

As long as that paradigmatic framework (or “theory”) is fruitful and brings forth new problems and successes, it remains inviolate from challenges, and practitioners strenuously resist attempts at overthrowing it. The “theory” is thus treated and defended as if it were a “fact” and it is this that is perceived by some outside of science as dogmatism and an unwillingness to change.

But as Kuhn so persuasively argues, it is this very commitment to a paradigm that is the reason for science’s amazing success, because the scientist working on a problem defined within a paradigm can be assured a priori that it is legitimate and important, and that only skill and ingenuity stand between her and the solution. Solving such problems within a paradigm is a sign of superior skill and brings rewards to the scientist who achieves it. Such conditions ensure that scientists will persevere in the face of challenges and adversity, and it is this kind of dogged determination that has resulted in the scientific breakthroughs from which we now benefit.

Kuhn likens this commitment of scientists to a paradigm to that of an industrialist to the manufacturing process that exists to make a particular product. As long as the product is made well, the manufacturer is not going to retool the factory because of the enormous effort and costs involved. Similarly, learning how to successfully exploit a scientific paradigm involves a long period of scientific apprenticeship in a discipline and scientists are unlikely to replace a working paradigm with another one without a very good reason. Learning to work well within a new paradigm is as costly as retooling a factory, and one does not do so cavalierly but only if one is forced into it. The dogmatism of science is thus pragmatic and not ideological.

But we do know that scientific revolutions, both major and minor, occur periodically. Very few of our current paradigms have a long history. So how and why do scientific paradigms change? They occur when the dominant paradigm shows signs of losing its fruitfulness, when it fails to generate interesting new problems or runs out of gas in providing solutions. It is almost never the case that one (or even a few) unsolved problems result in its overthrow because all scientific paradigms at all times have had many unsolved problems. A few counterexamples by themselves are never sufficient to overthrow a paradigm, though they can be a contributing factor. This is the fundamental error that advocates of intelligent design creationism (IDC) make when they argue that just because evolution by natural selection has not as yet explained some phenomena, Darwin’s theory must be rejected.

To be taken seriously, a new paradigm must also promise to be more fruitful than its predecessor, open up new areas of research, and promise new and interesting problems for scientists to work on. It does that by postulating naturalistic mechanisms that make predictions that can be tested. If it can do so and the predictions turn out to be successful, the commitment to the existing paradigm can be undermined, and the process begins by which the paradigm may be eventually overthrown. IDC has never come even close to meeting this requirement.

Some people have challenged the idea that scientific theories must, as necessary conditions, be naturalistic and predictive, arguing that insisting they be so is to impose dogmatic methodological rules. But the requirements that scientific theories be naturalistic and predictive are not ad hoc rules imposed from outside. They follow as a consequence of needing the paradigm to be able to generate new research programs. How could it be otherwise?

This is why IDC, by pointing to a few supposedly unsolved problems in evolutionary theory, has not been able to convince the biology community of the need to change the way it looks at things. Intelligent design creationism does not provide mechanisms, it does not make predictions, and it has not been able to produce new research.

When we discuss things in the light of the history of science, the students in my class understand why science does things the way it does, why it determinedly holds on to some theories while being willing to abandon others, and that this process has nothing to do with dogma in the traditional religious sense. Religious dogma consists of a commitment to an unchanging core set of beliefs. Scientific “dogma” (i.e. strong commitment to a paradigm and resistance to change) is always provisional and can under the right conditions be replaced by an equally strong commitment to a new “dogma.”

Almost all my students are religious in various ways, and while some find the idea of IDC appealing, they seem to have little difficulty understanding that its inability to enter the world of science is not a question of it being right or wrong, but is because of the nature of science and the nature of IDC. IDC simply does not fit into the kind of framework required to be a fruitful scientific theory.

Torture on 24

The willingness of our so-called intellectuals to use fiction as a basis for justifying barbaric policy decisions is truly astounding.

I have written before about how people who should know better (and probably do) continue to invoke the TV program 24 to justify the use of torture, because its main character routinely uses it to extract information from captives. It should come as no surprise that the creator of that program, Joel Surnow, describes himself as a “Bush fan” and plans to keep depicting torture on the show even though people who do interrogations professionally say that such practices are actually harmful.

In a recent New Yorker article (“Whatever it takes” by Jane Mayer, February 19, 2007), some senior army interrogators and trainers of soldiers are described as trying to get the program to stop pushing this idea so much, because it was giving army recruits the wrong idea of what kinds of interrogation techniques work, let alone which are legal.

This past November, U.S. Army Brigadier General Patrick Finnegan, the dean of the United States Military Academy at West Point, flew to Southern California to meet with the creative team behind “24.” Finnegan, who was accompanied by three of the most experienced military and F.B.I. interrogators in the country, arrived on the set as the crew was filming. At first, Finnegan—wearing an immaculate Army uniform, his chest covered in ribbons and medals—aroused confusion: he was taken for an actor and was asked by someone what time his “call” was.

In fact, Finnegan and the others had come to voice their concern that the show’s central political premise—that the letter of American law must be sacrificed for the country’s security—was having a toxic effect. In their view, the show promoted unethical and illegal behavior and had adversely affected the training and performance of real American soldiers. “I’d like them to stop,” Finnegan said of the show’s producers. “They should do a show where torture backfires.”

Finnegan told the producers that “24,” by suggesting that the U.S. government perpetrates myriad forms of torture, hurts the country’s image internationally. Finnegan, who is a lawyer, has for a number of years taught a course on the laws of war to West Point seniors—cadets who would soon be commanders in the battlefields of Iraq and Afghanistan. He always tries, he said, to get his students to sort out not just what is legal but what is right. However, it had become increasingly hard to convince some cadets that America had to respect the rule of law and human rights, even when terrorists did not. One reason for the growing resistance, he suggested, was misperceptions spread by “24,” which was exceptionally popular with his students. As he told me, “The kids see it, and say, ‘If torture is wrong, what about “24”?’ ” He continued, “The disturbing thing is that although torture may cause Jack Bauer some angst, it is always the patriotic thing to do.”

Gary Solis, a retired law professor who designed and taught the Law of War for Commanders curriculum at West Point, told me that he had similar arguments with his students. He said that, under both U.S. and international law, “Jack Bauer is a criminal. In real life, he would be prosecuted.”
. . .
The third expert at the meeting was Tony Lagouranis, a former Army interrogator in the war in Iraq. He told the show’s staff that DVDs of shows such as “24” circulate widely among soldiers stationed in Iraq. Lagouranis said to me, “People watch the shows, and then walk into the interrogation booths and do the same things they’ve just seen.”

But Surnow does not care for the testimony of experts in interrogation because, like his hero George W. Bush, what matters is what he feels in his gut: “We’ve had all of these torture experts come by recently, and they say, ‘You don’t realize how many people are affected by this. Be careful.’ They say torture doesn’t work. But I don’t believe that.”

So we have TV program creators helping to create a mindset in the country where illegal and immoral acts are considered just fine. When combined with media commentators and academics who also advocate barbaric acts, it is depressing but perhaps not surprising that there is little outcry when we hear of the torture of people held in the war on terror.

As Austin Cline points out in his essay Medicalizing torture and torturing medicine, the widening rot that is produced by encouraging and condoning torture extends to the medical profession. Torture cannot take place without the complicity of doctors, nurses, and other medical personnel who have to treat the tortured and hide the evidence that it has occurred. Although the recent revelations about conditions at Walter Reed hospital had nothing to do with torture, he points out that it could not have escaped notice for so long without the complicity of medical personnel as well and he argues that it is due to a public mindset that is becoming increasingly comfortable with people being dehumanized.

Once we shrug our shoulders at people being tortured and rationalize it by saying that they would be treated worse by other countries, it is not that far a step to view mistreated hospital patients as whiners who should be grateful for what they get rather than complain about what they don’t get.

A low-brow view of books

In yesterday’s post, I classified the appreciation of films according to four levels. At the lowest level is just the story or narrative. The next level above that is some message that the director is trying to convey and which is usually fairly obvious. The third level is that of technique, such as the quality of dialogue and acting and directing and cinematography and sound and lighting. And then there is the fourth and highest level, which I call deep meaning or significance, where there is a hidden message which, unlike the message at the second level, is not at all obvious but which has to be unearthed (or even invented) by scholars in the field or people who have a keen sensitivity to such things. I classified people whose appreciation does not get beyond the first two levels as low-brow.

The same classification scheme can be applied to books, especially fiction. In recent years I have started reading mostly non-fiction, but when it comes to fiction, I am definitely low-brow. To give an example of what I mean, take the novels of Charles Dickens. I like them because the stories he weaves are fascinating. One can enjoy them just for that reason alone. The second level meanings of his books are also not hard to discern. Many of his books were attempting to highlight the appalling conditions of poor children at that time or the struggles of the petite bourgeoisie of England. That much I can understand and appreciate.

What about his technique, the third level that I spoke of? The fact that I (and so many others over so many years) enjoy his books means that his technique must be good, but I could not tell you exactly what his technique is. It is not that I am totally oblivious to technique. His habit of tying up every single loose end at the conclusion of his books, even if he has to insert extraordinary coincidences involving even minor characters, is a flaw that even I can discern, but this flaw of structure is not fatal enough to destroy my enjoyment of his work.

There is probably a fourth level to Dickens that scholars have noticed but which I will never discover by myself. Here we get into the writer’s psyche, such as whether certain characters reflect Dickens’s own issues with his family’s poverty, his father’s time in a debtors’ prison, his relationship to his mother, and so on. This is where really serious scholars of Dickens come into their own, mining what is known of his life to discover the hidden subtext of his novels.

My inability to scale these heights on my own is the reason why there are some writers who are said to be geniuses whom I simply cannot appreciate. Take William Faulkner. I have read his novels The Sound and the Fury and As I Lay Dying and his short stories A Rose for Emily and Barn Burning, but I just don’t get his appeal.

In fact, I find his writing sometimes downright annoying. At the risk of incurring the wrath of the many zealous Faulkner fans out there, I think that Faulkner does not play fair with his readers, deliberately misleading them for no discernible reason. In The Sound and the Fury, for example, he abruptly keeps switching narrators on you without warning, each with their own stream of consciousness, but you soon get the hang of that and can deal with it. What really annoyed me was that he gives two characters the same name even though they are of different genders and different generations, and this fact is not revealed until the very end. Since this character is central to the story and is referred to constantly by the different narrators, I was confused pretty much all the way through as to what was going on, since I had naively assumed that the references were to the same person and the allusions to that person did not fit any coherent pattern. As a result, I found it hard to make sense of the story, and that ruined it for me. I could not see any deep reason for this plot device other than to completely confuse the reader. I felt tricked at the end and had no desire to re-read the book with this fresh understanding in mind.

This is not to say that writers should never misdirect their readers but there should be good reasons for doing so. I grew up devouring mystery fiction and those novels also hide some facts from their readers and drop red herrings in order to provide the dramatic denouement at the end. But that genre has fairly strict rules about what is ‘fair’ when doing this and what Faulkner did in The Sound and the Fury would be considered out of bounds.

More sophisticated readers insist to me that Faulkner is a genius for the way he recreates the world of rural Mississippi, the people and places and language of that time. That may well be true but that is not enough for me to like an author. When my low-level needs of story and basic message are not met, I simply cannot appreciate the higher levels of technique and deep meaning. Furthermore, there is rarely a sympathetic character in his stories. They all tend to be pathological and weird, which makes it even harder to relate to them.

I had similar problems with Melville’s Moby Dick. For example, right at the beginning there are mysterious shadowy figures who board the ship and enter Captain Ahab’s cabin, but they are never seen again, even though they do not seem to have left the ship before its departure. What happened to them? What was their purpose? And what do all the details about whaling (which make the book read like a textbook on the whaling industry) add to the story? Again, the main characters were kind of weird and unsympathetic, and I finished the book feeling very dissatisfied.

James Joyce’s Ulysses seems to me to be a pure exercise in technique and deep meaning that is probably a delight for scholars to pick through and interpret and search for hidden meanings, but that kind of thing leaves me cold. I simply could not get through it, and I also failed miserably with A Portrait of the Artist as a Young Man.

Gabriel Garcia Marquez, in his book Love in the Time of Cholera, pulls a stunt similar to Melville’s. His opening chapter introduces some intriguing and mysterious characters who then disappear, never to appear again or be connected with the narrative in even the most oblique way. I kept expecting them to become relevant to the story, to tie some strands together, but they never did, and I was left at the end feeling high and dry. Why were they introduced? What purpose were they meant to serve? Again, people tell me that Marquez is great at evoking a particular time and place, and I can see that. But what about the basic storytelling nature of fiction? When that does not make sense, I end up feeling dissatisfied.

I also have difficulty with the technique of ‘magic realism’ as practiced by Marquez in his One Hundred Years of Solitude and Salman Rushdie in The Satanic Verses. In this genre you have supernatural events, like ghosts appearing and talking to people, or people turning into animals and back again, and other weird and miraculous things, and the characters in the books treat these events as fairly routine and humdrum. I find that difficult to accept. I realize that these things are meant to be metaphors and deeply symbolic in some way, but I just don’t get it. These kinds of literary devices simply don’t appeal to me.

This is different from (say) Shakespeare’s plays, which I do enjoy. He, too, often invokes ghosts and spirits in his plays, but these are easily seen as driving the story forward, so it is easy to accept their presence. Even though I don’t believe in the existence of the supernatural, the people of his time actually believed in those things, and the reactions of the characters in his plays to the appearance of these ghosts and fairies seem consistent with their beliefs. But in a novel like The Satanic Verses, which takes place in modern times, to have a character turn into a man-goat hybrid and back to fully human again, with the other characters responding with only mild incredulity and not contacting the medical authorities, seems a little bizarre.

I would hasten to add that I am not questioning the judgment of experts that Faulkner and Melville and Joyce and Marquez and Rushdie are all excellent writers. One of the things about working at a university is that you realize that the people who study subjects in depth usually have good reasons for their judgments and that they are not mere opinions to be swept aside just because you happen to not agree with them. One does not go against an academic consensus without marshalling good reasons for doing so and my critiques of these writers are at a fairly low level and come nowhere close to being a serious argument against them. What I am saying is that for me personally, a creative work has to be accessible at the two lowest levels for me to enjoy it.

I think that there are two kinds of books and films. On the one hand, there are those that can be enjoyed and appreciated by low-brow people like me on our own; on the other, there are those that are best appreciated when accompanied by discussions led by people who have studied those books and authors and films and directors and know how to deal with them on a high level.

A low-brow view of films

Although I watch a lot of films, I realized a long time ago that my appreciation of films (or plays or books or concerts) is decidedly at a ‘low brow’ level. To explain what I mean: it seems to me that there are four levels at which one can appreciate a film (or play). At the lowest level is just the story or narrative. The next level above that is some message that the writer or director is trying to convey, which is usually fairly obvious. People whose appreciation does not get beyond these two levels are those I call low-brow. And I am one of them.

But I am aware there are higher levels of appreciation and criticism that can be scaled. The third level is that of technique, such as the quality of writing and things like acting and directing and cinematography and sound and lighting. And then there is the fourth and highest level, which I call deep meaning or significance, where there is a hidden message which, unlike the message at the second level, is not at all obvious but which has to be unearthed (or even invented) by scholars in the field or people who have a keen sensitivity to such things.

I almost never get beyond the first two levels. In fact, if the first level does not appeal to me, then no level of technique or profundity will rescue the experience. This does not mean that the items in the third level do not matter. They are obviously central to the enjoyment of the experience. It is just that I rarely notice the third-level items unless they are so bad that they ruin the storytelling. If the dialogue or acting (for example) is really rotten, then I will notice it, but if I don’t notice these things at all, then it means that they were good.

But I don’t even consider these things unless the first two levels are satisfactory. If the first two levels are bad, nothing at the higher levels can salvage the experience for me. I never leave a film saying things like “The story was awful but the camerawork was excellent.”

As an example, I really enjoy Alfred Hitchcock’s films and have seen nearly all of them, many multiple times. But I just enjoy the way he tells the stories. Since I enjoy reading about films after I have watched them, I often find people pointing out subtle effects of technique such as how he uses lighting or sets up a camera angle or how he creates a mood, and so on. While I enjoy having these things pointed out to me, I would never notice them on my own.

The same thing holds with the music soundtrack. When friends tell me that they enjoyed the soundtrack of a film that is not a musical, my usual response is “what soundtrack?” The only films in which I notice the soundtrack are those with obvious songs, such as (say) The Graduate or Midnight Cowboy, the latter having a wonderful theme song, Everybody’s Talkin’ by Harry Nilsson, and a beautifully haunting harmonica score that so pervades the film that even I noticed it.

The same happens with the fourth level of analysis, which is even more inaccessible to me. Just recently I read that in several of Hitchcock’s films, he was exploring homosexual themes. I had no idea and would never have figured that out on my own. While I have no talent for exploring these deeper levels of meaning, I appreciate the fact that there are people who can do so and are willing to share that knowledge. Reading them and talking about films with such knowledgeable and keenly observant people is a real pleasure.

I once had pretensions to ‘higher criticism’ (which deals with the third and fourth levels) myself but that ended one day when it became dramatically obvious that I had no clue how to do it. It was in 1975 when I watched the film If. . . (1968) by director Lindsay Anderson. I like Anderson’s films a lot. He creates strange and quirky films that deal with class politics in Britain, such as This Sporting Life (1963) and O Lucky Man (1973). The last one has an absolutely brilliant soundtrack and I noticed it because it consists of songs sung by British rocker Alan Price and he and his group periodically appear in the film to sing them, so you can’t miss the music. It is one of the rare CDs I bought of a film soundtrack, it was so good.

Anyway, my friends and I watched If. . . and we noticed that while most of the film was in color, some of the scenes were in black and white. We spent a long time afterwards trying to determine the significance of this, with me coming up with more and more elaborate explanations for the director’s intent, trying to make my theories fit the narrative. By an odd coincidence, soon after that I read an article that explained everything. It said that while making the film, Anderson had run low on money and had had to complete shooting with cheaper black and white film. Since films are shot out of sequence, the final product had this mix of color and black and white footage. That was it, the whole explanation, making laughable my elaborate theories about directorial intent. It was then that I gave up on the higher criticism, realizing that I would simply be making a fool of myself.

There are some films that are self-consciously technique-oriented, and I can appreciate them as such. For example Memento and Mulholland Drive are films that are clearly designed by the director to have the viewer try and figure out what is going on. They are like puzzles and I can enjoy them because they are essentially mystery stories (one of my favorite genres) in which the goal is to determine the director’s intent and methods used. Both films were a lot of fun to watch and grapple with.

But except in those special cases, I leave ‘higher criticism’ to those better equipped to do so. That is the nice thing about creative works of art. One can appreciate and enjoy them at so many different levels and each viewer or reader can select the level that best suits them.

Next: A low-brow view of books.