Changing notions of death-1: Brain death

There is nothing more bracing than starting a new week with the cheery topic of death. I have been thinking about it since listening to noted ethicist Peter Singer’s excellent talk on The ethics of life and death on March 21. He pointed out that the answer to the question “When is someone dead?” is not simple.

Most of us know, by listening to the abortion debate in the US, how hard it is to get agreement on when life begins. Singer’s talk highlighted the other problem, one that does not get nearly as much attention, and that is the question of how we decide that someone is dead.

(Caveat: I could only stay for the first 45 minutes of his talk and did not take notes, so my use of the ideas in his talk is based on my memory. Peter Singer is not to be blamed for any views that I may inadvertently ascribe to him. But his ideas were so provocative that I had to share and build on them. I can see why he is regarded as one of the premier ethical thinkers.)

It used to be that the definition of death was when the heart stopped beating and blood stopped flowing. But that definition was changed so that people whose hearts were still beating but whose brains had no activity were also deemed to be dead.

This change was implemented in 1980 by the Uniform Determination of Death Act, which was supported by the President’s Commission for the Study of Ethical Problems in Medicine and Biomedical and Behavioral Research. This act asserts that: “An individual, who has sustained either (1) irreversible cessation of circulatory and respiratory functions, or (2) irreversible cessation of all functions of the entire brain, including the brainstem, is dead. A determination of death must be made in accordance with accepted medical standards.”

Why did this change come about? Singer says that the background to this change raises some serious ethical questions. Thinking about changes in the definition of death was triggered by the first heart transplant operation done in 1967 by Dr. Christiaan Barnard in South Africa. Suddenly, the possibility of harvesting human hearts and other organs of dead people for use by others became much more realistic and feasible. But if you waited for the heart to stop beating to determine death, then that left you very little time to get a useful organ (because organs decay rapidly once blood stops flowing), whereas if people were merely ‘brain dead’ then you could get organs while they were still fresh and warm, since the circulatory system was still functioning at the time of removal.

Thus the first heart transplant in 1967 was the main impetus for the formation in 1968 of an ad hoc committee on brain death at Harvard Medical School. Its work laid the foundation for the 1980 shift in the definition of death, providing criteria for determining a condition known as “irreversible coma,” “cerebral death,” or brain death.

Note that the change in the definition of death was not due purely to better scientific knowledge of when people died. All that science could say was that, from past experience, a person who was ‘brain dead’ had never come back to a functioning state. It seems that the decision to change the definition of death was (at least partly) inspired by somewhat more practical considerations involving the need for organs for transplants.

But while the circumstances behind the change in the definition of death raise serious ethical questions, the idea that someone who was ‘brain dead’ was truly dead was a defensible proposition, whatever the reasons for its adoption.

To be continued. . .

POST SCRIPT: Quick! Get back in the closet!

Some time ago, I expressed surprise that some atheists felt uneasy about ‘coming out of the closet.’ But a new University of Minnesota study suggests that there may be good reason for their hesitancy.

From a telephone sampling of more than 2,000 households, university researchers found that Americans rate atheists below Muslims, recent immigrants, gays and lesbians and other minority groups in “sharing their vision of American society.” Atheists are also the minority group most Americans are least willing to allow their children to marry
. . .
Many of the study’s respondents associated atheism with an array of moral indiscretions ranging from criminal behavior to rampant materialism and cultural elitism.

These results are quite amazing. Of course, such negative stereotypes usually arise from ignorance, so maybe if people encountered more atheists and saw how ordinary they are, this view could be dispelled. But it is interesting how so many people feel that god is so integral to their “vision of American society.” America seems to be a theocracy in fact, if not in law.

Grade inflation-3: How do we independently measure learning?

Recall (see here and here for previous postings) that to argue that grade inflation has occurred, it is not sufficient to simply show that grades have risen. It must be shown that grades have risen without a corresponding increase in learning and student achievement. And that is difficult to do because there are really no good independent measures of student learning, apart from grades.

Some have argued that the SAT scores of matriculating classes could be used as a measure of student ‘ability’ and could thus be used to see if universities are getting ‘better’ students, thus justifying the rise in grades.

But the use of SAT scores as a measure of student quality or abilities has always been deeply problematic, so it is not even clear that any rise in SAT scores of incoming students means anything. One reason is that the students who take the SAT tests are a self-selected group and not a random sample, so one cannot infer much from changes in SAT scores. Second, SAT scores have not been shown to be predictive of anything really useful. There is a mild correlation of SAT scores with first year college grades but that is about it.

Even at Case, not all matriculating students have taken the SATs. Also, the average total SAT score from 1985-1992 was 1271, while the average from 1993-2005 was 1321. This rise in SAT scores of incoming students at Case would be affected by two factors, the first being the re-centering of SAT scores that occurred in 1995. It is not known whether the pre-1995 scores we have at Case are the original ones or have been raised to adjust for re-centering. This lack of knowledge makes it hard to draw conclusions about how much, if at all, SAT scores have risen at Case.

Alfie Kohn cites “Trends in College Admissions” reports that say that the average verbal-SAT score of students enrolled in all private colleges rose from 543 in 1985 to 558 in 1999. It is also true that around 1991 Case instituted merit scholarships based on SAT scores (the second factor) and began aggressively marketing them as a recruiting tool. So it is tempting to argue that there has been a genuine rise in SAT scores for students at Case.

Another local factor at Case that would influence GPAs is the practice of “freshman forgiveness” that began in 1987. Under this program, students in their first year would be “forgiven” any F grades they received and this F would not be counted towards their GPA. This is bound to have the effect of increasing the overall GPA, although a very rough estimate suggests only a 1-2% increase. This practice was terminated in 2005.
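A back-of-envelope calculation shows where an estimate of that size could come from; the grade mix assumed below (2% of credit hours graded F, a 3.0 average on everything else) is entirely hypothetical and chosen only for illustration:

```python
# Hypothetical estimate of how much dropping first-year F grades lifts GPA.
# Assume 2% of credit hours earn an F (0.0 quality points) and the
# remaining hours average 3.0. "Forgiveness" removes the F hours entirely.

f_frac = 0.02      # fraction of credit hours graded F (assumed)
other_avg = 3.0    # average grade on the remaining hours (assumed)

gpa_with_f = f_frac * 0.0 + (1 - f_frac) * other_avg  # GPA counting the Fs
gpa_forgiven = other_avg                              # GPA with Fs dropped

pct_rise = 100 * (gpa_forgiven - gpa_with_f) / gpa_with_f
print(round(gpa_with_f, 2), round(pct_rise, 1))
```

With these assumed numbers the GPA moves from 2.94 to 3.00, a rise of about 2%, which is consistent with the rough 1-2% estimate above. A larger assumed F rate would give a proportionally larger rise.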

The Rosovsky-Hartley monograph points to the fact that many more college students are now enrolled in remedial courses than in the past, arguing that this implies that students are actually worse now. But again, that inference is not clear. Over the recent past there has been a definite shift in emphasis in colleges toward retaining the students they recruit. The old model of colleges recruiting more students than they needed and then ‘weeding’ them out using certain first-year courses is no longer in vogue, assuming that there was substance to that belief and it is not just folklore.

Now universities go to great lengths to provide assistance to their students, beefing up their advising, tutoring, and other programs to help students stay in school. So the increased enrollment of students in remedial courses may simply be the consequence of universities taking a much more proactive attitude to helping students, rather than a sign of declining student quality. All these measures are aimed at improving student performance and are another possible benign explanation for any rise in grades. In fact, all these remedial and assistance programs could be used to argue that a rise in grades could be due to actual improved student performance.

Alfie Kohn argues that taking all these things into account, there is no evidence for grade inflation, that this is an issue that has been blown way out of proportion by those who have a very narrow concept of the role of grades in learning. Kohn says there are many reasons why grades could rise:

Maybe students are turning in better assignments. Maybe instructors used to be too stingy with their marks and have become more reasonable. Maybe the concept of assessment itself has evolved, so that today it is more a means for allowing students to demonstrate what they know rather than for sorting them or “catching them out.” (The real question, then, is why we spent so many years trying to make good students look bad.) Maybe students aren’t forced to take as many courses outside their primary areas of interest in which they didn’t fare as well. Maybe struggling students are now able to withdraw from a course before a poor grade appears on their transcripts. (Say what you will about that practice, it challenges the hypothesis that the grades students receive in the courses they complete are inflated.)

The bottom line: No one has ever demonstrated that students today get A’s for the same work that used to receive B’s or C’s. We simply do not have the data to support such a claim.

In addition to the factors listed by Kohn, psychologist Steve Falkenberg points out a number of other reasons why average grades could rise. His essay is a particularly thoughtful one that is worth reading.

Part of the problem in judging whether grade inflation exists is that we don’t know what the actual grade distribution in colleges should be. Those who argue that it should be a bell curve (or ‘normal’ distribution) with an average around C are mixing up a normative approach to assessment (as is used for IQ tests and SATs) with an achievement approach.

IQ tests and SATs are designed so that the results are spread out over a bell curve. They seek to measure a characteristic (called “intelligence”) that is supposedly distributed randomly in the population according to a normal distribution. (This assumption and the whole issue of what constitutes intelligence is the source of a huge controversy that I don’t want to get into here.) So the goal of such tests is to sort students into a hierarchy, and they are designed to spread out the scores so that one can tell who is in the top 10% and so on.

But when you teach a class of students, you are no longer dealing with a random sample of the population. First of all, you are not giving your assessments to people off the street. The students have been selected based on their prior achievements and are no longer a random sampling of the population. Secondly, by teaching them, you are deliberately intervening and skewing the distribution. Thirdly, your tests should not be measuring the same random variable that things like the SATs measure. If they were, you might as well give your students their grades based on those tests.

Tests should not be measures of some intrinsic ability, even assuming that such a thing exists and can be measured and a number assigned to it. Tests are (or at least should be) measuring achievement of how much and how well a selected group of students have learned as a result of your instruction. Hence there is no reason at all to expect a normal distribution. In fact, you would expect to have a distribution that is skewed towards the high end. The problem, if it can be considered a problem, is that we don’t know a priori what that skewed distribution should look like or whether there is a preferred distribution at all. After all, there is nothing intrinsically wrong with everyone in a class getting As, if they have all learned the material at a suitably high level.

In fact, as Ohmer Milton, Howard Pollio, and James Eison write in Making Sense of College Grades (Jossey-Bass, 1986): “It is not a symbol of rigor to have grades fall into a ‘normal’ distribution; rather, it is a symbol of failure — failure to teach well, failure to test well, and failure to have any influence at all on the intellectual lives of students.”

There is nothing intrinsically noble about trying to keep average grades unchanged over the years, which is what those who complain about grade inflation usually want to do.

On the other hand, one could make the reasonable case that as we get better at teaching and in creating the conditions that make students learn better, and as a consequence we get students who are able to learn more, then perhaps we should raise our expectations of students and provide more challenging assignments, so that they can rise to greater heights. This is a completely different discussion. If we do so, this might result in a drop in grades. But this drop is a byproduct of a thoughtful decision to make learning better, not caused by an arbitrary decision to keep average grades fixed.

This approach would be like car manufacturers and consumers raising their standards over the years so that we now expect a lot more from our cars than we did fifty years ago. Even the best cars of fifty years ago would not be able to meet the current standards of fuel efficiency, safety, and emissions. But the important thing to keep in mind is that standards have been raised along with the ability to make better cars able to meet the higher standards.

But taking this approach in education requires teachers to think carefully about what and how we assess, what we can reasonably expect of our students, and how we should teach so that students can learn more and learn better. Unfortunately, much of the discussion of grade inflation short-circuits this worthwhile aspect of the issue, choosing instead to go for a quick fix like putting limits on the number of grades awarded in each category.

It is perhaps worthwhile to remember that fears about grade inflation, that high grades are being given for poor quality work, have been around for a long time, especially at elite institutions. The Report of the Committee on Raising the Standard at Harvard University said: “Grades A and B are sometimes given too readily — Grade A for work of no very high merit, and Grade B for work not far above mediocrity. … One of the chief obstacles to raising the standards of the degree is the readiness with which insincere students gain passable grades by sham work.”

That statement was made in 1894.

POST SCRIPT: Cindy Sheehan in Cleveland tomorrow

Cindy Sheehan will speak at a Cleveland Town Hall Meeting Saturday, March 25, 1-3 pm

Progressive Democrats of Ohio present Gold Star Mother and PDA Board Member Cindy Sheehan at a Town Hall Meeting on Saturday, March 25, 2006 from 1 – 3 p.m. at the Beachland Ballroom, 15711 Waterloo Road in Cleveland’s North Collinwood neighborhood.

Topic: Examining The Cost of Iraq: Lives, Jobs, Security, Community

Panelists include:

US Congressman Dennis Kucinich, OH-10
Cindy Sheehan – Gold Star mother & activist
Tim Carpenter, National Director, Progressive Democrats of America
Francis Chiappa, President, Cleveland Peace Action
Paul Schroeder, NE Ohio Gold Star Father and co-founder of Families of the Fallen For Change
Farhad Sethna, Immigration attorney and concerned citizen

Grade inflation-2: Possible benign causes for grade increases

Before jumping to the conclusion that a rise in average grades must imply inflation (see my previous posting on this topic), we should be aware of the dangers that exist when we are dealing with averages. For example, suppose we consider a hypothetical institution that has just two departments A and B. Historically, students taking courses in A have had average grades of 2.5 while those in B have had 3.0. Even if there is no change at all in the abilities or effort of the students and no change in what the faculty teach or the way that faculty assess and grade, so that the average grades in each department remain unchanged, it is still possible for the average grades of the institution to rise, simply because the fraction of students taking courses in B has become larger.

There is evidence that this shifting around in the courses taken by students is just what is happening. Those who are convinced that grade inflation exists and that it is evil tend to interpret this phenomenon as game playing by students, that they are manipulating the system, choosing courses on the basis of how easy it is to get high grades rather than by interest or challenge.

For example, the ERIC report says “In Grade Inflation: A Crisis in College Education (2003), professor Valen E. Johnson concludes that disparities in grading affect the way students complete course evaluation forms and result in inequitable faculty evaluations. . . Students are currently able to manipulate their grade point averages through the judicious choice of their classes rather than through moderating their efforts. Academic standards have been diminished and this diminution can be halted, he argues, only if more principled student grading practices are adopted and if faculty evaluations become more closely linked to student achievement.”

This looks bad and the author obviously wants to make it look bad, as can be seen from his choice of the word ‘manipulate’ to describe the students’ actions and the way he implies that faculty are becoming more unprincipled in their grading practices. But there is no evidence for the evil motivations attributed to such students and faculty. In fact, one can look at the phenomenon in a different way. It is undoubtedly true that students now have many more choices than they did in the past. There are more majors and more electives. When you offer more choices, students are more likely to choose courses they are interested in and thus are more likely to do better in them.

Furthermore, even if students are choosing courses partly based on the grade they expect to receive, we should not be too harsh in our judgments. After all, we have created a system in which grades seem to be the basis for almost everything: admission to colleges and graduate schools, honors, scholarships, and financial aid. As I said, grades have become the currency of higher education. Is it any wonder that students factor in grades when making their choices? If a student tries to balance courses they really want to take with those they know they can get a high grade in, so as to maintain the GPA they need to retain their scholarships, why is this to be condemned? This seems to me to be a sensible strategy. After all, faculty do that kind of thing all the time. When faculty learn that the NIH or NSF is shifting its grants funding emphasis to some new research area, many will shift their research programs accordingly. We do not pour scorn on them for this, telling them that they should choose research topics purely based on their interests. Instead, we commend them for being forward thinking.

It certainly would be wonderful if students chose courses purely on the basis of their interest or usefulness or challenge and not on grade expectations, but to put students in the current grade-focused environment and expect them to ignore grades altogether when making their course selection is to be hypocritical and send mixed messages.

What about the idea that faculty grading standards have declined and that part of the reason is that they are giving easy grades in order to get good evaluations? This is a very popular piece of folklore on college campuses. But this question has also been studied and the data simply do not support it. It does seem to be true that students tend to get higher grades in the courses they rate higher. But to infer a causal relationship, that if a faculty member gives higher grades they will get better evaluations, is wrong.

People who have studied this find that if a student likes a course and a professor (and thus gives good evaluations), then they will tend to work harder at that course and do better (and thus get higher grades), thus bringing about the grades-evaluations correlation that we see. But how much a student likes a course and professor seems to depend on whether the student feels that he or she is actually learning interesting and useful stuff. Students, like anybody else, don’t like to feel they are wasting their time and money, and do not enjoy being with a professor who does not care for them or respect them.

Remember that these studies report on general trends. It is quite likely that there exist individual professors who give high grades in a misguided attempt to bribe students into giving good evaluations, and that there exist students willing to be so bribed. But such people are not the norm.

To be continued. . .

POST SCRIPT: And the winner is. . .

Meanwhile, there are Americans who have already decided which country the US should invade next in the global war on terror, even if they haven’t the faintest idea where that country is on the globe or why it should be invaded. Even Sri Lanka gets a shot at this particularly dubious honor.

Here’s an idea for a reality show, along the lines of American Idol. It will be called Who’s next?. The contestants will be the heads of state of each country, and this time their goal will be to get voted off, because the last remaining country gets bombed and invaded by the US. The judges could be Dick Cheney (to provide the sarcastic put-downs a la Simon Cowell. Sample: “You think we’re going to waste our smart bombs on your dumb country?”), Donald Rumsfeld, and Condoleezza Rice.

Fox television, my contract is ready and I’m waiting for your call.

Grade inflation-1: What is it and is it occurring?

There is nothing that gets the juices flowing in higher education academic circles than the topic of grade inflation. Part of the reason for this passion may be because grades and test scores, and not learning, seem to have become the currency of education, dominating the thinking of both students and faculty. Hence some people monitor grades as an important symptom of the health of universities.

But what is curious is that much of the discussion is done in the absence of any hard data. It seems as if perceptions or personal anecdotes are a sufficient basis to draw quite sweeping conclusions and prescriptions for action.

One of the interesting things about the discussion is how dismayed some faculty get simply by the prospect that average grades have risen over time. I do not quite understand this. Is education the only profession where evidence, at least on the surface, of a rise in quality is seen as a bad thing? I would hope that like in every other profession we teachers are getting better at what we do. I would hope that we now understand better the conditions under which students learn best and have incorporated the results of that knowledge into our classrooms, resulting in higher achievement by students. Any other profession or industry would welcome the news that fewer people are doing poorly or that fewer products are rejected for not meeting quality standards. But in higher education, rising grades are simply assumed to be bad.

Of course, if grades are rising because our assessment practices are becoming lax, then that is a cause for concern, just as if a manufacturer reduces the rejection rate of their product by lowering quality standards. This is why having an independent measure of student learning and achievement to compare grades with has to be an important part of the discussion.

Grade inflation is a concept that has an analogy with monetary inflation, and to infer that inflation (as opposed to just a rise) in grades has occurred implies that grades have risen without a corresponding increase in learning and student achievement. But in much of the discussion, this important conditional clause is dropped and a rise in grades is taken as sufficient evidence by itself that inflation has occurred.

Let’s take first the question of whether average grades have actually risen. At Case, as some of you may know, beginning January 2003, the GPA cutoffs to achieve honors were raised to 3.56 (cum laude), 3.75 (magna cum laude), and 3.88 (summa cum laude) so that only 35% of students would be eligible for honors. (The earlier values were 3.20, 3.50, and 3.80 respectively.) This measure was taken because the number of people who were graduating with honors had risen steadily over the years, well above the 35% originally anticipated when the earlier bars were set.

A look at grade point averages at Case shows that it was 2.99 in 1975 (the first year for which we have this data), dropped slowly and steadily to 2.70 in 1982, rose to 3.02 in 1987, stayed around that value until 1997, and since then has oscillated around 3.20 until 2005, with the highest reaching 3.27 in 2001. The overall average for the entire period was 3.01 and the standard deviation was about 0.70. (I am grateful for this and other Case data to Dr. Julie Petek, Director of Degree Audit and Data Services.)

It is hard to know what to make of this. On the one hand we could start at the lowest point in the grade curve and say that grades have risen by half a letter grade from 1982 to 2005. Or we could start at 1975 and say that grades are fluctuating in the range 2.70-3.30, or within about half a standard deviation of the mean of 3.0.
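For concreteness, the Case figures quoted above (mean 3.01, standard deviation about 0.70, low of 2.70 in 1982, high of 3.27 in 2001) give the following simple arithmetic:

```python
# Express the swings in Case's yearly average GPA in units of the
# reported standard deviation of 0.70 around the long-run mean of 3.01.
mean_gpa = 3.01
std_dev = 0.70

low, high = 2.70, 3.27   # lowest (1982) and highest (2001) yearly averages

below = (mean_gpa - low) / std_dev    # how far below the mean, in SDs
above = (high - mean_gpa) / std_dev   # how far above the mean, in SDs

print(round(below, 2), round(above, 2))
```

Both excursions come out at roughly 0.4 standard deviations, which is the sense in which the yearly averages can be said to wander within about half a standard deviation of the mean.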

What does the data say nationwide? Henry Rosovsky and Matthew Hartley, writing in a monograph for the American Academy of Arts and Sciences, are convinced that inflation has occurred. For evidence of grades increasing, they point to various nationwide surveys that show that average grades rose by 0.43 from 1960 to 1974; that in 1993 the number of grades of A- or higher was 26%, compared to 7% in 1969; that the number of C’s dropped from 25% to 9% in that same period; and that averages rose from 3.07 in the mid 1980s to 3.34 in the mid 1990s.

In this last result, it is interesting to note that another study found that grades rose on average only at selective liberal arts colleges and research universities, while they declined at general liberal arts colleges and comprehensive colleges and universities, and in the humanities and social sciences.

The Rosovsky-Hartley monograph has been criticized for depending on research that itself relied on surveys, and such studies can be questioned because it is not clear how reliable the responses are, depending as they do on self-reporting.

A 2003 ERIC Digest of the literature on this topic finds results that cast doubt on the basic question of whether average grades have even risen. For example, “Clifford Adelman, a senior research analyst with the U.S. Department of Education, reviewed student transcripts from more than 3,000 colleges and universities and reported in 1995 that student grades have actually declined slightly over the last 20 years.” (my emphasis). His study of 16.5 million graduates from 1999-2000 also found that 14.5% of these students received As while 33.5% received grades of C or lower.

What is significant about the Adelman study is that he used actual student transcripts, not surveys, and thus seems to me to be more reliable.

It seems from this data and other studies that average grades have not increased across the board, but it is plausible that they have increased at selective liberal arts colleges and research universities. The Rosovsky-Hartley monograph says that “In 1966, 22 percent of all grades given to Harvard undergraduates were in the A range. By 1996 that percentage had risen to 46 percent and in that same year 82 percent of Harvard seniors graduated with academic honors. In 1973, 30.7 percent of all grades at Princeton were in the A range and by 1997 that percentage had risen to 42.5 percent.”

Even though it has not been conclusively established, suppose, for the sake of argument, we concede that selective liberal arts colleges and research universities (such as Case) have seen a rise in average grades. Is this automatically evidence of grade inflation? Or are there more benign causes, such as that we are getting better prepared and more able students now, or that our teaching methods have improved? Another important issue is whether Case’s experience of rising grades is part of a national trend or an exception.

To be continued. . .

POST SCRIPT: Where’s the balance?

Over at Hullabaloo, Tristero catches the Washington Post in a blatant act of bias in favor of atheistic science. The Post article says:

Scientists said yesterday they have found the best evidence yet supporting the theory that about 13.7 billion years ago, the universe suddenly expanded from the size of a marble to the size of the cosmos in less than a trillionth of a second.

Tristero points out that the article completely fails to mention the controversy around this question, that there is another side to this story, that the big bang is “only a theory” since no one was there to actually observe this event and prove that it happened.

And not a word of balance from the other side, as if the sincere faith of millions of Americans in a Christian God didn’t matter at all to the Post’s editors.

I just hate it when the media reports carefully vetted scientific data as fact and not as just one of many valid points of view. I’m not asking for them to ignore the opinions of these so-called scientists, but they really should report the fact there’s a lot of controversy about whether this kind of evidence is valid. Like, were you there, huh, Mr. Hotshot Washington Post?

Is god punishing Kansas?

The last year has seen too many examples of the devastating power that nature can unleash. The Asian tsunami, hurricane Katrina, and the earthquake in Pakistan all caused massive destruction and loss of life, leaving hundreds of thousands of people homeless and impoverished and grieving for lost loved ones.

From time immemorial, people have looked on such events as indicators of god’s feelings towards humans. Of course, some despicable people have tried to use this natural tendency to seek signs of god’s power to actually wish for some calamity to strike some community as a sign of god’s righteous anger. Pat Robertson is just one of those despicable people who seems to think that what makes him angry must also be what makes god angry.

Recall that he was upset with the citizens of Dover for the way they voted out the school board that had adopted the pro-IDC (intelligent design creationism) statement in biology classes. He said: “I’d like to say to the good citizens of Dover. If there is a disaster in your area, don’t turn to God, you just rejected Him from your city. And don’t wonder why He hasn’t helped you when problems begin, if they begin. I’m not saying they will, but if they do, just remember, you just voted God out of your city. And if that’s the case, don’t ask for His help because he might not be there.” (See here for the video.)

But unfortunately for Robertson, God’s anger seemed to have been directed not at the rebellious people of Dover, but at the state of Kansas, which was recently hit by a series of over 100 deadly tornadoes that raced through the midwest, causing widespread destruction, most of it in Kansas.

The irony here is that Kansas is, according to Robertson’s measures, perhaps the most pro-god state in the nation, inserting IDC-friendly language into its science standards and even going so far as to rewrite the definition of science, so that it is no longer limited to the search for natural explanations of phenomena.

The state of Kansas did not rest on these laurels, but in addition overwhelmingly passed (with 71% of the vote) a constitutional amendment banning same-sex couples from marrying or entering into civil unions. There was already an unchallenged law on the books that did the same thing but the people of Kansas wanted to send an even stronger signal to god of their devoted opposition to gay rights.

As Rep. Steve Huebert said “It is the right thing to do, based on the truth that was spoken (in the Bible).”

Kansas is even the home of Reverend Fred Phelps and his merry flock of gay-hating funeral disrupters.

How much more can a state do to please the almighty? So if any state felt entitled to special favors from god, it was Kansas. Getting hit in return with a series of tornadoes seems like a pretty ungrateful act on the deity’s part.

So what will Robertson say about Kansas, and not Dover, being the target of tornadoes? Perhaps he will say that the people of Kansas must be harboring some secret sins that made them deserving of this.

Or maybe god was aiming the tornadoes at Dover but they veered unexpectedly away, thus making Kansas the victim of friendly fire, so to speak. (Coriolis forces are tricky to account for when you are dealing with fast-flowing wind currents, and have caused many a seasoned meteorologist to err in their predictions of where and when a storm will hit.)

Or maybe the good people of Kansas, and Pat Robertson, may consider the possibility that this evidence suggests that god is sending a message that he or she is actually strongly pro-gay and anti-IDC.

Or maybe, just maybe, these kinds of things are called ‘natural disasters’ for a reason, not just because they involve nature, but because they have no supernatural cause.

POST SCRIPT: Peter Singer lecture today

Peter Singer, the well-known ethicist, will give a lecture today on “Our Changing Ethics of Life and Death” at 4:00 p.m., Ford Auditorium, Allen Memorial Medical Library, 11000 Euclid Avenue, Cleveland.

The talk is free and open to the public. No reservations required.

For more details, see here.

The politics of fear bites back

If there is anything that shows how cynical and manipulative the politics of fear have become, it is the controversy over the Dubai-based company Dubai Ports World taking over management of some US ports. As everyone knows, that company has now said that because of the huge negative reaction, it is handing over that operation to a US-based entity, although the details of that transfer have not been released.

This is an example of the chickens coming home to roost for this administration. To understand this, we have to go back to the events of 9/11. One way to view that disaster was to see it as a criminal act for which the perpetrators had to be sought and brought to justice, as Timothy McVeigh was for the Oklahoma City bombing.

But that would not have served the purposes of the neoconservatives, who needed a grand enemy in order to pursue their vision of global conquest. Treating 9/11 as a criminal matter, handled by international law enforcement agencies, would not have let them take that route. In wars, innocent people die, and in order to get the public to accept this, one needs an undifferentiated enemy, so that anyone who gets killed can be seen as somehow deserving of it, if only by virtue of sharing some characteristics with those who actually carried out the crimes.

So a global enemy had to be manufactured and portrayed as this vast shadowy conspiracy seeking to undermine and then overthrow America, so that the only appropriate response was to go to war. This war was initially marketed to the public as the “global war on terror.” Attempts were made last year to change the brand name to the “struggle against violent extremism,” perhaps because the acronym SAVE tested better in market research than GWOT. But that change seems to have been nixed by President Bush perhaps because, as Jon Stewart said, Bush likes to think of himself as a “war president” and not a “struggling president.” The latest attempt at a brand name change is to call it “the long war”. This change has been proposed just this year and we’ll have to see if it takes root.

In this war, the undifferentiated enemy necessarily had to be Muslims and Arabs because the middle east was the target. But despite lip service to the notion that not all Muslims and Arabs were being targeted, the rhetoric of the war on terror and the need to try and link al Qaida to Iraq, inevitably led to that particular group of people being demonized.

Wars and warmongers have little use for subtleties. The fact is that much of the Muslim world is cosmopolitan, modern, and linked integrally into the world trade system, with thriving economies, as was the case with Iraq before the first Gulf war in 1991 and the subsequent imposition of sanctions.

But the Dubai Ports World deal has exposed the essential fraudulence of the so-called war on terror. The general public has been conditioned to think of Muslims and Arabs as malevolent beings and potential terrorists and thus there inevitably was a huge outcry at allowing such people access to American ports. And the White House and congressional Republicans and Democrats are responding hypocritically to this reaction.

This administration used a bludgeon against those who argued that acts of terrorism required a nuanced approach, accusing them of not being tough enough or not understanding the dangers the country faced from this vast global enemy coming out of the middle east. Now the same administration is aggrieved that people are not taking a nuanced approach to its dealings with the Arab world.

William Greider, writing in The Nation magazine, ridicules conservative columnist David Brooks for saying that the adverse reaction to the ports deal was an example of political hysteria because the “experts” tell [Brooks] there is no security risk involved. Greider writes:

Of course, he is correct. But what a killjoy. This is a fun flap, the kind that brings us together. Republicans and Democrats are frothing in unison, instead of polarizing incivilities. Together, they are all thumping righteously on the poor President. I expect he will fold or at least retreat tactically by ordering further investigation. The issue is indeed trivial. But Bush cannot escape the basic contradiction, because this dilemma is fundamental to his presidency.

A conservative blaming hysteria is hysterical, when you think about it, and a bit late. Hysteria launched Bush’s invasion of Iraq. It created that monstrosity called Homeland Security and pumped up defense spending by more than 40 percent. Hysteria has been used to realign US foreign policy for permanent imperial war-making, whenever and wherever we find something frightening afoot in the world. Hysteria will justify the “long war” now fondly embraced by Field Marshal Rumsfeld.
. . .
Bush was the principal author, along with his straight-shooting Vice President, and now he is hoisted by his own fear-mongering propaganda. The basic hysteria was invented from risks of terrorism, enlarged ridiculously by the President’s open-ended claim that we are endangered everywhere and anywhere (he decides where). Anyone who resists that proposition is a coward or, worse, a subversive. We are enticed to believe we are fighting a new cold war. But are we? People are entitled to ask. Bush picked at their emotional wounds after 9/11 and encouraged them to imagine endless versions of even-larger danger. What if someone shipped a nuke into New York Harbor? Or poured anthrax in the drinking water? OK, a lot of Americans got scared, even people who ought to know better.

So why is the fearmonger-in-chief being so casual about this Dubai business?

Because at some level of consciousness even George Bush knows the inflated fears are bogus. So do a lot of the politicians merrily throwing spears at him. He taught them how to play this game, invented the tactics and reorganized political competition as a demagogic dance of hysterical absurdities, endless opportunities to waste public money. Very few dare to challenge the mindset. Thousands have died for it.

It is interesting how even local people have picked up on how to play this game and use this hysteria to advance their own interests. In Cleveland, two companies that own commercial office space downtown are protesting a third because that company has been more successful than they at getting tenants to fill their office buildings. The complaint? The successful company is (gasp!) owned by an arm of the Kuwaiti government! Oh, the horror!

The Plain Dealer reports:

UPDATE: I have received an email from one of the people mentioned in the Plain Dealer article disputing the characterization of his views in the article. At his request, I have removed the passage.

What’s next in this anti-Muslim and anti-Arab crusade? Muslims not allowed to buy property in certain areas? Not allowed to get bank loans? Not allowed to park in handicapped spots?

How low can we go?

POST SCRIPT: Biblical inerrancy

Some time ago I wrote about Biblical inerrancy and discussed Bart Ehrman’s recent book Misquoting Jesus. Jon Stewart has an interesting interview with the author.

Iraq and Afghanistan: The Reckoning

As the third anniversary of the beginning of the military invasion of Iraq approaches on March 19, it is time to take stock of the consequences of that tragic and cruel war. Below are three items.

Before we get to that, it is sobering to recall the almost Pollyannaish hubris of the media in the early days of the invasion in April and May of 2003. On April 16, 2003, assured that things were going swimmingly in Iraq, columnist Cal Thomas took aim at those of us who opposed the war saying, using Biblical language: “All of the printed and voiced prophecies should be saved in an archive. When these false prophets again appear, they can be reminded of the error of their previous ways and at least be offered an opportunity to recant and repent. Otherwise, they will return to us in another situation where their expertise will be acknowledged, or taken for granted, but their credibility will be lacking.”
[Read more…]

Opium of the people

Most people, when they think of Karl Marx’s attitudes towards religion, remember the quote where he refers to it as “the opium of the people.” This sounds quite dismissive. When I first heard it, I thought he meant that religion was a hallucination, similar to that caused by drugs.

But when you read the full passage that leads up to this quote, the impression shifts slightly, but in an important way. The passage is in his “Contribution to the Critique of Hegel’s Philosophy of Right” (February 1844) and goes as follows:

Religious suffering is, at one and the same time, the expression of real suffering and a protest against real suffering. Religion is the sigh of the oppressed creature, the heart of a heartless world, and the soul of soulless conditions. It is the opium of the people.

This is a much more poetic and sympathetic view of religion than that given by just the last sentence. It speaks of religion as the solace of a suffering people, a mechanism for them to obtain relief from the forces that oppress them, to endure suffering, and something that enables them to extract some happiness from life.

(Note that when Marx wrote this, it was soon after the end of the first Opium War in China (1839-1842), in which Britain defeated Chinese government attempts to end the import of opium by British traders, a trade that was causing widespread addiction in that nation. He must have been aware of the fact that having the people drugged on opium enabled the relatively small British presence in China to control that vast country and its peoples. So it was a very timely metaphor.)

Of course, Marx was opposed to religion because, although he saw that it met a short-term need, it hindered the ability of people to achieve a more enduring happiness. It was clear that he was not against religion just for the sake of it, but because he wanted to get rid of the terrible conditions that led people to seek refuge and solace in it, rather than seeking to change those very conditions. He went on:

The abolition of religion as the illusory happiness of the people is the demand for their real happiness. To call on them to give up their illusions about their condition is to call on them to give up a condition that requires illusions. The criticism of religion is, therefore, in embryo, the criticism of that vale of tears of which religion is the halo.

As Amanda Udis-Kessler writes in this commentary on Marx’s comment:

Opium, of course, provides only temporary relief for suffering, and does so by blunting the senses. In making suffering bearable, Marx argues, opium (and religion) actually can be said to be contributing to human suffering by removing the impetus to do whatever is necessary to overcome it – which, for Marx, is to relinquish religion and turn to revolutionary politics. Hamilton (1995: 82-3) points out the ultimate practical outcome of religion’s palliative function, from a Marxian perspective: “Religion offers compensation for the hardships of this life in some future life, but it makes such compensation conditional upon acceptance of the injustices of this life.”

In other words, religion serves the social function of keeping people from becoming restive about their condition and is thus conducive to maintaining social order in the face of even massive injustice.

Marx’s views on religion came to my mind when I was thinking about how the neoconservatives aligned themselves with religious Christian fundamentalists in order to achieve their political goals. The neoconservatives are grateful that religion is like opium, keeping people in a drugged, unperceptive state. It is this very feature that is attractive to those who benefit from that state of injustice.

Ronald Bailey, writing in Reason magazine, argues that neoconservatives are cynically exploiting the palliative nature of religious beliefs. And to serve this end, they are even going so far as to align themselves with those religious people who are attacking Darwin: “These otherwise largely secular intellectuals may well have turned on Darwin because they have concluded that his theory of evolution undermines religious faith in society at large. . . Ironically, today many modern conservatives fervently agree with Karl Marx that religion is ‘the opium of the people’; they add a heartfelt, ‘Thank God!’”

There is reason to think that at least some of the neoconservatives are themselves not religious, but see in religion a useful tool that keeps people in line, like sheep. Their fear of what might happen if there is widespread existential angst leads them to a cynical support for the Christian fundamentalist view.

Bailey goes on:

[Neoconservative Irving] Kristol has been quite candid about his belief that religion is essential for inculcating and sustaining morality in culture. He wrote in a 1991 essay, “If there is one indisputable fact about the human condition it is that no community can survive if it is persuaded–or even if it suspects–that its members are leading meaningless lives in a meaningless universe.”
. . .
Kristol has acknowledged his intellectual debt to [Neoconservative ideologue Leo] Strauss in a recent autobiographical essay. “What made him so controversial within the academic community was his disbelief in the Enlightenment dogma that ‘the truth will make men free.’ ” Kristol adds that “Strauss was an intellectual aristocrat who thought that the truth could make some [emphasis Kristol’s] minds free, but he was convinced that there was an inherent conflict between philosophic truth and political order, and that the popularization and vulgarization of these truths might import unease, turmoil and the release of popular passions hitherto held in check by tradition and religion with utterly unpredictable, but mostly negative, consequences.”

Kristol agrees with this view. “There are different kinds of truths for different kinds of people,” he says in an interview. “There are truths appropriate for children; truths that are appropriate for students; truths that are appropriate for educated adults; and truths that are appropriate for highly educated adults, and the notion that there should be one set of truths available to everyone is a modern democratic fallacy. It doesn’t work.”
. . .
A year ago, I asked Kristol after a lecture whether he believed in God or not. He got a twinkle in his eye and responded, “I don’t believe in God, I have faith in God.” Well, faith, as it says in Hebrews 11:1, “is the substance of things hoped for, the evidence of things not seen.”

But at the recent AEI lecture, journalist Ben Wattenberg asked him the same thing. Kristol responded that “that is a stupid question,” and crisply restated his belief that religion is essential for maintaining social discipline. A much younger (and perhaps less circumspect) Kristol asserted in a 1949 essay that in order to prevent the social disarray that would occur if ordinary people lost their religious faith, “it would indeed become the duty of the wise publicly to defend and support religion.”

William Pfaff, writing on “The Long Reach of Leo Strauss” in the International Herald Tribune, traces the influence of neoconservative thinking, outlining the broader ideological framework under which religious belief is promoted by the neoconservatives:

They have a political philosophy, and the arrogance and intolerance of their actions reflect their conviction that they possess a realism and truth others lack.

They include Deputy Defense Secretary Paul Wolfowitz; Abram Shulsky of the Pentagon’s Office of Special Plans, Richard Perle of the Pentagon advisory board, Elliott Abrams of the National Security Council, and the writers Robert Kagan and William Kristol.

The main intellectual influence on the neoconservatives has been the philosopher Leo Strauss, who left Germany in 1938 and taught for many years at the University of Chicago. Several of the neoconservatives studied under him. Wolfowitz and Shulsky took doctorates under him.

Something of a cult developed around Strauss during his later years at Chicago, and he and some admirers figure in the Saul Bellow novel, “Ravelstein.” The cult is appropriate because Strauss believed that the essential truths about human society and history should be held by an elite, and withheld from others who lack the fortitude to deal with truth. Society, Strauss thought, needs consoling lies.
. . .
He also argued that Platonic truth is too hard for people to bear, and that the classical appeal to “virtue” as the object of human endeavor is unattainable. Hence it has been necessary to tell lies to people about the nature of political reality. An elite recognizes the truth, however, and keeps it to itself. This gives it insight, and implicitly power that others do not possess. This obviously is an important element in Strauss’s appeal to America’s neoconservatives.

The ostensibly hidden truth is that expediency works; there is no certain God to punish wrongdoing; and virtue is unattainable by most people. Machiavelli was right. There is a natural hierarchy of humans, and rulers must restrict free inquiry and exploit the mediocrity and vice of ordinary people so as to keep society in order.

There is something repulsive to me in the idea that there are some ideas that have to be shielded from people because they are unable to handle them. It is a dangerous and paternalistic attitude, and profoundly undemocratic. But it is clear that the neoconservatives believe it. For them, the truth is not an unqualified good. Rather, it is something that only a select few, who alone are wise enough to really understand and use it, should know, and those people get to decide what the general public should be told, even if it is false.

To argue that one should present different truths for different people is wrong. One has an obligation to not deceive people. Of course, one may present what one perceives to be the truth to different audiences in different ways. It is like teaching the basics of mathematics or science. How I teach it to elementary school students and to doctoral students will be quite different but the ideas I try to convey should be the same.

Since the listener always interprets new knowledge in the light of his or her own experiences, we will each construct our own version of the truth.

But that is quite different from deliberately constructing different “truths” to present to different people in order to get them to conform to your will. Such things are no longer truths. They are simply manipulative lies. They are the modern-day opium of the people.

The politics of terrorism-3: A more complex picture

We have seen that in the BBC documentary The rise of the politics of fear, the main narrative is the parallel rise of two movements: one of militant Islamic fundamentalism, the other of neoconservatism.

On the Islamic side, the movement now known as al Qaida is traced to the vision of one scholar, Syed Qutb, whose followers amalgamated his political vision with that of Islamic fundamentalism, a fusion embodied in the Muslim Brotherhood. For these groups, the main enemy is the global encroachment of decadent western values into the Muslim world, aided and abetted by corrupt governments.

In the US, we had the vision of one scholar, Leo Strauss, whose followers formed a secular political movement with global ambitions that allied itself with Christian religious fundamentalists as a means of achieving its political goals, thus bringing to life the neoconservative movement.

This narrative of the parallel lives of two scholars who lived at the same time (Qutb from 1906-1966 and Strauss from 1899-1973) and whose disciples have brought about this major conflict makes for compelling drama, and the documentary was absorbing. But the price it may have paid is that of oversimplification. While the idea of these two opposing strands has a dramatic point-counterpoint appeal, there is some evidence that the clash we are seeing is not simply an ideological one that pits neoconservative purists against Islamic purists.

It is clear that there are more pragmatic reasons that come into play, for example, in the US attack on Iraq, although we are not able to precisely say which, if any, were the dominant reasons. In addition to the neoconservative ideological view advocated by the documentary that it is America’s destiny to bring democracy, by force if necessary, to the other countries of the world starting with Iraq, the following is a list of the many other reasons that have been speculated about: the control of Iraqi oil; the need to establish a strategic and long-term military base in the Middle East since Saudi Arabia was asking the US to leave its soil; Iraq as the first step in a successive series of invasions of other countries such as Iran and Syria so that eventually the US would control the entire region; to act in Israel’s interests and disarm an enemy of Israel; to project US power and show the world that the US had the power to invade any nation it wanted to, thus cowing any other nation’s ambitions to challenge the US in any way; to prevent Saddam Hussein from switching to the euro as a reserve currency for oil purchases, thus threatening US financial markets; an opportunity to test the new generation of weaponry in the US arsenal; to finish what was seen as unfinished business from the first Gulf war in 1991; to avenge the alleged attempt by Iraq on George H. W. Bush’s life; to enable George W. Bush to show his father that he was tougher than he was; because George W. Bush, despite his efforts to avoid actual military service himself, was enamored of the idea of being Commander in Chief and dearly wanted to be a ‘war president.’

The ‘weapons of mass destruction’ rationale is conspicuously missing from the above list. It was, I believe, always a fiction, promulgated as part of the politics of fear but never taken seriously even by those in the administration who advocated it. It is quite amazing that three years after the invasion of Iraq, we still don’t know the precise reasons for that fateful decision.

On the other side, al Qaida also had more practical stated reasons for what it did, reasons that were not limited to a mere distaste for encroaching decadent western values. They were opposed to Israel’s treatment of the Palestinians and America’s complicity in and support for those policies, they wanted US bases out of the middle east, they opposed the corrupt governments that were ruling many middle east countries and being supported militarily and economically by the US, and they deplored the cruelty of US-led UN sanctions against Iraq. In other words, they were more opposed to what the US does than to what it is. It may also be that al Qaida is not quite as weak as the documentary portrays it to be.

But despite these shortcomings, the basic message of the documentary rings true: the threat to the US from terrorism has been vastly and deliberately overstated and is being used to ram through policies that would otherwise be rejected. As the documentary says:

“In an age when all the grand ideas have lost credibility, fear of a phantom enemy is all the politicians have left to maintain their power.”

A Guardian article titled The making of the terror myth says:

Adam Roberts, professor of international relations at Oxford, says that governments often believe struggles with terrorists “to be of absolute cosmic significance”, and that therefore “anything goes” when it comes to winning. The historian Linda Colley adds: “States and their rulers expect to monopolise violence, and that is why they react so virulently to terrorism.”

Whatever the causes for inflating the threat of terror, it is clear that we are the losers when we live in fear, and we need to counteract it. The BBC documentary is well worth watching for the way that it connects many different pieces of knowledge into a coherent story, told in an entertaining way.

POST SCRIPT: CSA: Confederate States of America

I recently heard about the above film, which is a fake documentary (along the lines of the classic This is Spinal Tap). This one seems like an alternate reality version of Ken Burns’ PBS series The Civil War. The website for the film describes what we can expect:

CSA: THE CONFEDERATE STATES OF AMERICA, through the eyes of a faux documentary, takes a look at an America where the South won the Civil War. Supposedly produced by a British broadcasting company, the feature film is presented as a production being shown, controversially, for the first time on television in the States.

Beginning with the British and French forces joining the battle with the Confederacy, thus assuring the defeat of the North at Gettysburg and ensuing battles, the South takes the battle northward and form one country out of the two. Lincoln attempts escape to Canada but is captured in blackface. This moment is captured in the clip of a silent film that might have been.

Through the use of other fabricated movie segments, old government information films, television commercials, newsbreaks, along with actual stock footage from our own history, a provocative and humorous story is told of a country, which, in many ways, frighteningly follows a parallel with our own.

The film will be screened at Shaker Square Cinemas starting March 17. You can see the trailer here.

The politics of terrorism-2: The origins of the neoconservatives

Yesterday, I discussed how the BBC documentary The rise of the politics of fear traced the origins of al Qaida to the influence of the Islamic scholar Syed Qutb.

Meanwhile, in the US, there was a cult brewing around a University of Chicago professor of political philosophy by the name of Leo Strauss (1899-1973). That ideology has now come to be called neoconservatism. Strauss and his neoconservative disciples (which include Paul Wolfowitz, Donald Rumsfeld, Dick Cheney, Irving Kristol, John Bolton, Richard Perle, William Kristol, Michael Ledeen, and others) were people who felt, like Qutb, that US society was decadent and losing its moral strength.

The Straussians felt that it was necessary for America to develop a positive image of itself, to see itself as the ultimate force for good in the world and its role as spreading its influence over the entire world, by force if necessary. They were fundamentally elitist, seeing people as the “masses” to be led, sometimes in spite of themselves. They believed that people needed “grand myths” in order to be persuaded to take serious actions like war, and they did not hesitate to manufacture them when necessary. For America, the “grand myth” they propagandized was the idea that Americans were a good and chosen people, under siege from dark forces from within and without, with a mission to convert the world to its own way of life.

In order to mobilize the people in this way, the neoconservative movement needed a grand enemy. It is hard to mobilize people for war and conquest unless they feel threatened by a darkly evil force. As Hermann Goering, Hitler’s Reichsmarschall and chief of the Luftwaffe, famously said: “The people can always be brought to the bidding of the leaders. That is easy. All you have to do is tell them they are being attacked, and denounce the pacifists for lack of patriotism and exposing the country to danger. It works the same in any country.” (Nuremberg Diary by Gustave Gilbert, Farrar, Straus and Company, 1947, pp. 278-279.)

So grand enemies were created. At first, Communism and the Soviet Union served this purpose, becoming the enemy from the 1950s onwards. The strength of the Soviet Union and the world Communist movement was consistently overestimated and its motives were consistently questioned, all leading to a feeling of paranoia at home. This paranoia enabled the US to create a huge military machine. But while the neoconservatives gained some influence in government, especially during the second Reagan administration, they did not actually seize the reins of power.

Ray McGovern, who served as a CIA analyst for 27 years from the administration of John F. Kennedy to that of George H. W. Bush and who, during the early 1980s, was one of the writers/editors of the President’s Daily Brief and briefed it one-on-one to the president’s most senior advisers, said that President George H. W. Bush kept the neoconservatives at arm’s length, because he knew how dangerous they were. In fact, the neoconservatives were known among political insiders as “the crazies.” McGovern writes:

During his term in office, George H. W. Bush, with the practical advice of his national security adviser Gen. Brent Scowcroft and Secretary of State James Baker, was able to keep “the crazies” at arms length, preventing them from getting the country into serious trouble. They were kept well below the level of “principal” — that is, below the level of secretary of state or defense.

The neoconservatives are so rabid in their expansionist ambitions that they even considered Henry Kissinger (one of the key architects of the vicious bombing of Vietnam and Cambodia, and a supporter of brutal dictators like General Suharto in Indonesia and General Pinochet in Chile as they murdered thousands) too moderate, someone who had to be elbowed aside. “Crazies” seems like a good name for them.

The neoconservatives’ ambitions were finally fully realized in 2000 with the election of George W. Bush, which gave them a pliant president. Now they were in charge of the major policy decisions and set about fulfilling their dreams.

In order to consolidate their power and extend their influence over American politics, the neoconservatives made a tacit alliance with Christian fundamentalists, using the enticing idea of America being God’s favorite country, specially chosen to carry out its mission of being a civilizing force in the world. Conveniently, this tied in with the neoconservatives’ military and political goals of overthrowing the governments of other countries, especially those in the Middle East.

This alliance also marked a shift in American religious fundamentalism, from a movement that had shunned involvement in electoral politics as being not worthy of people whose ultimate interest is life after death and whose goal is heaven, to one that became intensely involved with politics, seeking to have its moral perspectives become the law of the land.

This merging of fundamentalist Christian religion with a geopolitical neoconservative worldview is portrayed in the documentary as being in parallel with the Islamic fundamentalists seeking to embed their religious perspective into the political framework and impose their moral code as the laws of the Islamic countries.

Meanwhile, the Soviet Union ceased to exist as a superpower equal to the US. The documentary argues that the fundamental cause was internal, that the Soviet Union was simply a failed state, unable to deliver the goods to its people, but the actual collapse was partly triggered by its defeat in Afghanistan in 1989 at the hands of the US-backed Islamic fundamentalists. The collapse of the Soviet Union, while hailed as a victory by both the US and the Islamic fundamentalists who battled them in Afghanistan, left the US neoconservatives without a grand evil enemy with which to frighten the US public and thus win its agreement to their global strategy of spreading democracy by force all over the world.

But the rise of al Qaida provided that new enemy. The BBC documentary argues that while the attacks of September 11, 2001 were spectacular in the manner and level of destruction they caused, they were not a sign of wide support and deep strength. But by constantly harping on al Qaida as if it were some giant malevolent and dangerous foe, the current US administration, along with the Blair government in the UK, has managed to recreate a level of fear that perhaps exceeds even that which existed during the cold war, thus enabling them to mobilize public support to systematically attack countries like Iraq, and to keep Iran and Syria in their target sights. In addition, it has enabled them to undermine civil liberties at home, and to create a climate where anything (indefinite and arbitrary detention, torture, murder, kidnapping) is considered acceptable.

The neoconservative movement sees as its goal the use of force to overthrow real and perceived enemies of the US. Its members see themselves as the vanguard, the people who really understand the world and America’s destiny to lead it, and to use its military power to establish the new world order.

To be continued. . .