Why I blog

I reached a kind of landmark this week with this blog. I have been making entries since January 26th, posting one item each weekday, except for a three-week break in June. As a result I have now posted over 100 entries, totaling over 100,000 words, longer than either of my two published books.

Why do I blog? Why does anyone blog? The Doonesbury comic strip of Sunday, July 3, 2005 fed into the stereotype of bloggers as self-important losers who cannot get real jobs as writers and who feed their egos by pretending that what they say has influence. The idea behind this kind of disparaging attitude is that if no one is willing to pay you to write, then what you have to say has no value.

Of course, there are a vast number of bloggers out there, with an equally vast number of reasons why they blog, so any generalization is probably wrong. So I will reflect on why I blog. Some bloggers may share this view, others may have different reasons. So be it.

The first reason is the simple fact that, because of the blog, I have written the equivalent of a complete book in six months. Writing is not easy, especially starting to write on any given day. Having a blog enforces on me a kind of discipline that would not exist otherwise. Before I started this blog, I would let ideas swirl around in my head without actually putting them down in concrete form. After a while, I would forget about them, but be left with a nagging feeling of dissatisfaction that I should have explored the ideas further and written them down.

The second benefit of writing is that it forces you to clarify and sharpen your ideas. It is easy to delude yourself that you understand something when you have the idea only in your mind. Putting those ideas to paper (or screen) has the startling effect of revealing gaps in knowledge and weaknesses of logic and reasoning, thus forcing a re-evaluation of one’s ideas. So writing is not a one-way process from brain to screen/paper. It is a dialectic process. Writing reveals your ideas but also changes the way you think. As the writer E. M. Forster said, “How can I tell what I think till I see what I say?” This is why writing is such an important part of the educational process and why I am so pleased that the new SAGES program places such emphasis on it.

Another benefit for me is that writing this blog has (I hope) helped me become a better writer, able to spot poor construction and word choice more quickly. Practice is an important part of writing and the blog provides me with that. The fact that the blog is public and can (in principle) be read by anyone prevents me from posting careless or shoddy pieces. It forces me to take the time to repeatedly revise and polish, essential skills for writers.

When I started this blog, I had no idea what form it would take. Pretty soon, almost without thinking, it slipped into the form that I am most comfortable with, which is that of a short essay around a single topic each day. I initially feared that I would run out of ideas to write about within a few weeks, but this has not happened. In fact what happens is what all writers intuitively know but keep forgetting, which is that the very act of writing serves as a spur for new ideas, new directions to explore.

As I write, new topics keep coming into my mind, which I store away for future use. The ideas swirl around in my head as I am doing other things (like driving and chores), and much of the writing takes place in my mind during those times as well. The well of ideas to write about does not show any signs of going dry, although it does take time to get the items ready for posting, and that is my biggest constraint. Researching those topics so that I go beyond superficial “off the top of my head” comments and have something useful to say about them has been very educational for me.

Since I have imposed on myself the goal of writing an essay for each weekday, I have essentially written the first draft (which is the hardest part of writing, for me at least) of many topics that may subsequently become articles (or even books) submitted for publication. If I do decide to expand on some of the blog items for publication, that process should be easier since I have done much of the preliminary research, organization, and writing already.

All these benefits have accrued to me, the writer, and this is no accident. I think most writing benefits the author most, for all the reasons given above. But any writer also hopes that the reader benefits in some way as well, though that is hard for the author to judge.

I remember when I was younger, I wanted to “be a writer” but never actually wrote anything, at least anything worthwhile. Everything I wrote seemed contrived and imitative. I then read a comment by someone who said that there is a big difference between those who want to be writers and those who want to write. The former are just enamored with the idea of getting published, of being successful authors and seeing their name in print. The latter feel that they have something to say that they have to get out of their system. I realized then that I belonged to the former class, which is why I had never actually written anything of value. With that realization, I stopped thinking of myself as a writer and did not do any writing other than the minimum required for my work. It is only within the last ten years or so that I feel that I have moved into the latter category, feeling a compulsion to write for its own sake. This blog has given me a regular outlet for that impulse.

I would never have written so much without having this blog. I would recommend that others who feel like they have to write also start their own. Do not worry about whether anyone will read it or whether they will like it. Write because you feel you have something to say. Even if you are the only reader of your own writing, you will have learned a lot from the process.

POST SCRIPT

Paul Krugman is an economist at Princeton University and is a member of the reality-based community. His July 15, 2005 op-ed in the New York Times shows how far politics has moved away from this kind of world and into one in which facts are seen as almost irrelevant.

Thanks to Richard Hake for the following quote from F. M. Cornford, Microcosmographia Academica – Being a Guide for the Young Academic Politician (Bowes & Bowes, Cambridge, 4th ed., 1949; first published in 1908), which might well have been addressed to Krugman and other members of the reality-based community, although it was written almost a century ago:

You think (do you not?) that you have only to state a reasonable case, and people must listen to reason and act upon it at once. It is just this conviction that makes you so unpleasant…. Are you not aware that conviction has never been produced by an appeal to reason, which only makes people uncomfortable? If you want to move them, you must address your arguments to prejudice and the political motive….

“I know this is not politically correct but….”

One of the advantages of being older is that sometimes you can personally witness how language evolves, and how words and phrases shift and sometimes undergo outright reversals of meaning.

One of the interesting evolutions is that of the phrase “politically correct.” It was originally used as a kind of scornful in-joke within Marxist political groups to sneer at those members who seemed to have an excessive concern with political orthodoxy and who seemed to be more concerned with vocabulary than with the substance of arguments and actions.

But later it became used against those who were trying to use language as a vehicle for social change by making it more nuanced and inclusive and less hurtful, judgmental, or discriminatory. Such people advocated using “disabled” instead of “crippled,” or “mentally ill” instead of “crazy,” or “hearing impaired” instead of “deaf,” and so on, in an effort to remove the stigma under which those groups had traditionally suffered. Those who felt such efforts had been carried to an extreme disparaged them as attempts to be “politically correct.”

The most recent development has been to shift the emphasis from sneering at the careful choosing of words to sneering at the ideas and sentiments behind those words. The phrase has started being used pre-emptively, to shield people from the negative repercussions of stating views that may otherwise be offensive or antiquated. This usage usually begins with “I know this is not politically correct but…” and then finishes up with a statement that would normally provoke quick opposition. So you can now find people saying “I know this is not politically correct but perhaps women are inferior to men at mathematics and science” or “I know this is not politically correct but perhaps poor people are poor because they have less natural abilities” or “I know this is not politically correct but perhaps blacks are less capable than whites at academics.” The opening preamble is not only designed to make such statements acceptable; it even allows the speaker to claim the mantle of being daring and brave, an outspoken and even heroic bearer of unpopular or unpalatable truths.

Sentiments that would normally be considered discriminatory, biased, and outright offensive if uttered without any supporting evidence are protected from criticism by this preamble. It is then the person who challenges this view who is put on the defensive, as if he or she were some prig who unthinkingly spouts an orthodox view.

As Fintan O’Toole of The Irish Times pithily puts it:

We have now reached the point where every goon with a grievance, every bitter bigot, merely has to place the prefix “I know this is not politically correct but…” in front of the usual string of insults in order to be not just safe from criticism but actually a card, a lad, even a hero. Conversely, to talk about poverty and inequality, to draw attention to the reality that discrimination and injustice are still facts of life, is to commit the new sin of political correctness… Anti-PC has become the latest cover for creeps. It is a godsend for every sort of curmudgeon or crank, from the fascistic to the merely smug.

Hate blacks? Attack positive discrimination – everyone will know the codes. Want to keep Europe white? Attack multiculturalism. Fed up with the girlies making noise? Tired of listening to whining about unemployment when your personal economy is booming? Haul out political correctness and you don’t even have to say what’s on your mind.

Even marketers are cashing in on this anti-PC fad, as illustrated by this cartoon.

Perhaps it is my physics training, but I tend to work from the principle that in the absence of evidence to the contrary, we should assume that things are equal. For example, physicists assume that all electrons are identical. We don’t really know this for a fact, since it is impossible to compare all electrons. The statement “all electrons are identical” is a kind of default position and, in the absence of evidence to the contrary, does not need to be supported by positive evidence.

But the statement “some electrons are heavier than others” is going counter to the default position and definitely needs supporting evidence to be taken seriously. Saying “I know this is not politically correct but I think some electrons are heavier than others” does not make it any more credible.

The same should hold for statements that deal with people, because I would like to think that the default position is that people are (on average) pretty much the same in their basic needs, desires, feelings, hopes, and dreams.

POST SCRIPT 1

I love movies but am not a big fan of Tom Cruise’s films. I was surprised, though, by the way people went after him (and his Church of Scientology) for his recent comments on psychiatry and mental illness. I was bemused first that this topic arose in the interview at all, and then by the subsequent reaction, in which it was as if people felt that he had no right to his views on this subject. Even if people disagree with him, why do they get so upset? Why do people even care what his views are about psychiatry? I was thinking of writing something on this incident but then came across this article which covers some of the ground I would have, and also raises the problematic role that the big pharmaceutical companies have played in this issue of treating illnesses with drugs.

POST SCRIPT 2

The invaluable website Crooks and Liars has posted a funny Daily Show video clip about the Valerie Plame affair. Since I do not have cable and don’t watch much TV anyway, I depend on Crooks and Liars and onegoodmove to alert me to TV segments that I would otherwise miss. These two sites are well worth bookmarking.

Should professors reveal their views?

During the last academic year, UCITE organized a faculty seminar on whether, and how much, of their own views professors should reveal to the students in their classes.

One faculty member recalled one of her own teachers admiringly. She said that he had guided the discussions in her college classes very skillfully, in such a way that no one knew what his own views were on the (often controversial) topics they discussed. She felt that his refusal to reveal his own views led to a greater willingness on the part of students to express their own, since they were not agreeing or disagreeing with an authority figure. She felt that his model was one that others should follow.

Underlying this model is the belief that students may fear that going against the views of the professor might result in them being penalized or that agreeing with the professor might be viewed as an attempt at ingratiation to get better grades.

I am not convinced by this argument, both on a practical level and on principle, but am open to persuasion.

As a purely practical matter, I am not sure how many of us have the skills to pull off what this admired professor did. It seems to me that it would be enormously difficult to spend a whole semester discussing things with a group of people without revealing one’s own position on the topics. It is hard to keep aloof from the discussion if one is intensely interested in the topic. As readers of this blog know, I have opinions on a lot of things, and if such topics come up for discussion, I am not sure that I have the ability to successfully conceal my views from students. Most of us will betray ourselves, by word or tone or nuance, despite our best efforts at concealment.

But I am also not convinced that this is a good idea even in principle, and I’d like to put out my concerns and get some feedback, since I know that some of the readers of this blog are either currently students or have recently been students.

One concern about hiding my own views is precisely that the act of hiding means that I am behaving artificially. After all, I assume that students know that academics tend to have strong views on things, and they will assume that I am no exception. Those students who speak their minds irrespective of the instructor’s views won’t care whether I reveal my views or not, or whether they agree with me or not. But for those students for whom my views are pertinent, isn’t it better for them to know exactly where I stand so that they can tailor their comments appropriately and accurately, rather than trying to play guessing games and risk being wrong?

Another concern that I have arises from my personal view that the purpose of discussions is not to argue or change people’s views on anything but for people to better understand why they believe whatever they believe. And one of the best ways to achieve such understanding is to hear the basis for other people’s beliefs. By probing with questions the reasoning of other people, and by having others ask you questions about your own beliefs, all of the participants in a discussion obtain deeper understanding. In the course of such discussions, some of our views might change but that is an incidental byproduct of discussions, not the goal.

Seen in this light, I see my role as a teacher as modeling this kind of behavior, and this requires me to reveal my views, to demonstrate how I use evidence and argument to arrive at my conclusions. I feel (hope?) that students benefit from hearing the views of someone who has perhaps, simply by virtue of being older, thought about these things for a longer time than they have, even if they do not agree with my conclusions. To play the role of just a discussion monitor and not share my views seems to defeat one of the purposes of my being in the classroom.

The fly in the ointment (as always) is the issue of grades. I (of course) believe that I will not think negatively of someone who holds views opposed to mine and that it will not affect their grades. But that is easy for me to say since I am not the one being graded. Students may not be that sanguine about my objectivity, and may worry about how I view them if they should disagree with me.

When I raised this topic briefly with my own class last year, they all seemed to come down in favor of professors NOT hiding their personal views. But I am curious as to what readers of this blog think.

Do you think professors should reveal their views or keep them hidden?

POST SCRIPT 1

The website Crooks and Liars has posted a funny video clip from the Daily Show that addresses how high levels of fear are generated in America, a topic that I blogged about earlier.

This article by John Nichols compares the British response to the tragedy with the way the American media tried to frame it.

POST SCRIPT 2

Also, for those of you struggling to keep up with the complicated set of issues involved in the Valerie Plame-Joseph Wilson-Robert Novak-Judith Miller-Matthew Cooper-confidential journalistic sources affair, there is an excellent article by Robert Kuttner that (I hope) clears things up.

Catholic Church reversing course on evolution?

It was only on May 19 that I compared the religious reactions to two major scientific revolutions, those identified with Copernicus and Darwin, and showed that in each case religious objections to the new theories arose only more than a half-century after the theories were published, and then began with Protestants rather than the Catholic Church. The religious opposition may have been slow in coming because it took some time for the theological implications of the new theories to be realized. In fact, the religious opposition was rising just about the time that the scientific debates were ending, and the scientific community was coalescing behind the new theories as more and more supporting data came in.

Public and private grief

One of the things that strikes me is that America seems to have a fascination with memorials and ceremonies honoring the dead.

There are memorials for the various major wars, there is a memorial built for the Oklahoma City bombing, for the Lockerbie disaster, and there is the present bitter argument over the proposed memorial at the site of the World Trade Center. But there is more to it than physical memorials. There are also memorial ceremonies held on the anniversaries of these events, complete with flags, prayers, political leaders, speeches, and media coverage.

Has it always been like this or is this a relatively new phenomenon? I ask because this extended public and organized brooding on tragedy seems strange to me. In my experience growing up in Sri Lanka, after a major disaster, people tend to quickly clear up the mess and move on. There are some memorials, but they tend to be for dead political figures and are built by their immediate families or their political supporters. The idea of public memorializing is not widespread.

Of course, the family and friends of people lost to tragedies feel grief, and this is a universal phenomenon, transcending national and cultural boundaries. It is perfectly natural for such people to feel a sense of sadness and loss when an anniversary date comes around, reminding them of those who are no longer part of their lives. The personal columns of newspapers in Sri Lanka are filled, as they are here, with the sad stories of loss, some from many, many years ago.

But I wonder how much of this memorializing and solemnity is widespread among people who do not suffer a direct personal loss. At each anniversary of 9/11, for example, the media solemnly report that the whole nation ‘paused in grief’ or something like that. But among the people I know and work with, no one talks about the events on the anniversaries. Are we a particularly callous group of people, or is my experience shared by others? Of course, people may reflect on the events on those days but how much of that is media inspired, because the newspapers and radio and TV keep talking about it? If the media ignored these anniversaries, would ordinary people give these anniversaries more than a passing thought? How many people feel a sense of grief or sorrow on the anniversaries of disasters that did not affect them personally?

In Sri Lanka, the recent tsunami killed about 40,000 people in a matter of minutes. It is the worst single disaster in a country that has known a lot of tragedies, both natural and human-caused. Like disasters everywhere, it brought out the best in people as they overlooked ethnic, religious, and linguistic barriers and joined in the massive relief efforts, helping total strangers with whatever means they had at their disposal.

And yet, on my recent visit, I did not hear of any plans for a public tsunami memorial. I am fairly certain that if anyone proposed it, people would (I think rightly) argue that the money would be better spent on relief for the victims of the disaster rather than on something symbolic.

This made me wonder whether, while private grief is a universal emotion, public grief is a luxury that only the developed world can afford to indulge in. In countries where the struggle of day-to-day living takes most of one’s energy, is grief a precious commodity that people can expend only on the loss of their nearest and dearest, except in the immediate aftermath of a major tragedy?

Undermining faith in the judiciary

I have always believed that people tend to behave better than one might expect when placed in positions of trust where high standards of behavior are expected of them. One occupation in particular exemplifies this belief: that of the judge.

The public expects members of the judiciary to act according to higher standards than the rest of us and I think that this expectation generally tends to be fulfilled. I believe that whenever someone enters a profession that has a noble calling, the very nature of the office tends to produce an ennobling effect.

This is particularly so in the case of the higher levels of the judiciary. A person who becomes a Supreme Court judge, for example, is well aware that he or she is occupying a select position of great trust and responsibility, and I cannot help but believe that this will rub off on that person, making him or her strive to be worthy of that trust. This does not make them superhuman. They are still subject to normal human weaknesses and failures. They may still make wrong decisions. But I think that in general they behave better by virtue of occupying those positions than they might otherwise, and try to live up to the standards expected of them.

But this works only if the judges feel they are entering a noble calling and that they are expected to live up to it. If the prestige and dignity of the judiciary are undermined by treating judges as if they were just political hacks, then they will behave accordingly. This is why I view with concern attempts by people, especially political leaders, to undermine faith in the judiciary. There are two ways in which this happens.

The first way is to personally attack judges whenever a decision does not go the way they wanted it to go. This tendency has accelerated in recent years in the US, as can be seen by the ugly venom heaped on the Florida judge in the Terri Schiavo case. We have seen similar invective hurled at judges when they have ruled in ways that people have not liked on hot-button issues, ranging from First Amendment cases involving religion to flag burning to abortion. The judges have been decried as “judicial activists” and worse.

The second way to undermine the judiciary is by clearly seeking to appoint judges precisely because they have a particular political agenda, and not because they display the intellect and independence of thought that a good judge should have.

Sri Lanka again offers an unfortunate precedent for this degeneration. It used to have a fairly independent judiciary, nominated by a Judicial Services Commission whose members were at least one step removed from direct political influence. It was expected that the JSC would nominate people who had serious credentials, and hence there was a belief among the general public that judges were, on the whole, impartial, although individual judges here and there may have been suspect. But again beginning in the 1970s, the government started to severely criticize judges who ruled against the government, even sending mobs to demonstrate in front of judges’ homes to try to intimidate them.

After that, it was only a short step to create a more overtly political process for the selection of judges, in order to ensure that decisions would be more acceptable to the government. Despite this, the ennobling effect that I spoke of earlier helped to make the judges better than one might expect, but it was a losing battle. When I was in Sri Lanka last month, I was told that faith in the impartiality of the judiciary had been badly undermined by the cumulative effects over the years of such negative policies.

This is a real pity because this kind of credibility, once lost, is hard to regain. Undermining the judiciary in this way is a dangerous trend for any nation that values the rule of law. When you undermine faith in the impartiality and honesty of judges, you are just one step away from mob rule.

As I said above, the US seems to have already started down this unfortunate road. The upcoming battle for the Supreme Court vacancy created by the retirement of Sandra Day O’Connor will provide a good indication of the shape of things to come. It will be unfortunate if people focus on the nominee’s views on specific issues. I would much rather see an examination of whether the person shows a scholarly mind; whether he or she has shown independence of thought; whether the person bases judgments and reasoning on evidence and on universal principles of justice and the constitution; whether the person shows compassion and understanding of the human condition in all its complexity; whether he or she listens to, understands, and appreciates the arguments of even those whom he or she rules against; and whether he or she appreciates that the judiciary is the ultimate safeguard of rights and liberties for individuals against the massive power of governments and corporations. In other words, does the person have what we might identify as a “judicial temperament”?

If you speak with lawyers, they can often identify those judges whom they respect, even when those judges rule against the lawyers. Identifying what makes judges respected despite their specific opinions on specific cases is what the discussions about selecting a new Supreme Court justice should be all about.

But if the discussion ends up being (as I fear it will) about the nominee’s views on the Ten Commandments, abortion, gay marriage, flag burning, and the like, then we will be continuing to cheapen the whole Supreme Court.

The nature of the nominee, and of the discussion surrounding the nomination, will tell us a lot about how we will view the judiciary in the days, and perhaps generations, to come.

POST SCRIPT

Are you interested in having thoughtful discussions on deep topics? Consider attending the Socratease discussions. These are open to anyone and are held on the second Tuesday of each month (the next one is on July 12) at the Nighttown restaurant on Cedar Road in Cleveland Heights (in the Cedar/Coventry area). The discussions run from 7:30 to 9:30 pm. You are under no obligation to order food or drink from the restaurant.

The format is that anyone present who wishes to can suggest a topic for the night’s discussion; a vote is then taken, and the winning topic becomes the focus for that night.

Should one use raw political power or govern by consensus?

The second parallel I saw between political developments in Sri Lanka and in the US has been the breakdown of the usual rules of behavior regarding building consensus.

To some extent, politics in both Sri Lanka and the US is an insiders’ game. The people in the leadership of the two main parties tend to be members of the same elites, representing two wings of the same political and social class, and the same interests. Hence they tend to adopt ‘rules of the game’ that are usually civil and polite. One benefit of this civility is that the interests of the minority party at any given time are not completely ignored, because the ruling party understands that it might be in the minority after the next election. The disadvantage is that the two parties do not challenge the basic status quo and the ruling elites, since they are both members of that same class.

Looking only at the positive side, the protection of minority interests in consensus-style governance has the effect of providing some political continuity, especially in parliamentary systems, where the legislature is a collection of individuals, each elected to represent a given geographic area. Under such a system, it is possible for a political party with just a bare majority of voters to have an overwhelmingly large majority in the legislature (a party that wins 51% of the vote in every district wins every seat). In multiparty systems, it is possible for a party that does not have even a majority of the popular vote to have a majority in the legislature. This can lead to wide swings in legislative majorities after each election (without a corresponding swing in actual votes or voter sentiment), so having rules that enable people to function both in and out of power becomes important.
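To make the vote-to-seat distortion concrete, here is a toy sketch in Python with made-up numbers (the parties, vote shares, and district count are all hypothetical, not drawn from any real election):

```python
# A toy sketch (all numbers hypothetical) contrasting seat allocation
# under single-member districts with proportional representation.

def district_seats(district_results):
    """Each district's single seat goes to the local plurality winner."""
    seats = {}
    for result in district_results:  # result maps party -> vote share
        winner = max(result, key=result.get)
        seats[winner] = seats.get(winner, 0) + 1
    return seats

def proportional_seats(national_shares, total_seats):
    """Seats allocated in proportion to the national vote (rounded)."""
    return {party: round(share * total_seats)
            for party, share in national_shares.items()}

if __name__ == "__main__":
    # Party A wins 52-48 in each of 100 identical districts...
    districts = [{"A": 0.52, "B": 0.48}] * 100
    print(district_seats(districts))  # {'A': 100} -- a clean sweep
    # ...while proportional allocation of the same votes splits the seats.
    print(proportional_seats({"A": 0.52, "B": 0.48}, 100))  # {'A': 52, 'B': 48}
```

The same mechanism, run in reverse after a modest swing in sentiment, is what produces the wild seat swings described above.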

In Sri Lanka, this sense of following unwritten rules that benefited both sides existed until around 1970, and then started falling apart as each of the major parties began using raw political power to ram through policies that ignored the interests of the opposing party. In 1970, one party (the SLFP) got a huge (more than two-thirds) majority and used it to change the constitution by a device of somewhat dubious legality. They even used their majority to unilaterally extend the life of the parliament by two years, so that instead of elections falling due in 1975, they were next held in 1977.

But in 1977, there was another big swing of the political pendulum and the opposition party (the UNP) came to power with a huge (also more than two-thirds) majority. They too enacted sweeping constitutional changes, a chief one being the replacement of the old system of individual seats with a proportional representation system. Since proportional representation tends to produce a parliament that more accurately reflects the percentage of votes a party receives, it also had the effect of ensuring that future parliaments would be unlikely to have the two-thirds majorities needed to undo the constitutional changes that the UNP had put in place.

But since the huge majority that the UNP obtained in 1977 remained in place until the next election, the UNP could still make constitutional changes at will, and this it proceeded to do, sometimes in the most self-serving way, making a mockery of democratic principles. For example, the government extended the life of the parliament (with its huge majority) by using a simple-majority referendum.

Another example of using raw power was how political defections were handled. It used to be that governments could be toppled if enough members of parliament switched allegiance from the government party to the opposition. Members had done this in the past for a variety of reasons: for political principle, as a mark of protest, or merely out of personal ambition or other similarly ignoble motives.

The post-1977 constitution eliminated this threat to the government’s stability by saying that if a member switched parties, that person automatically lost his or her seat and was replaced by someone from the party being vacated, thus maintaining the party status quo. This was a huge disincentive for any member to switch, since nothing would be gained. But then some members of the opposition said they wanted to switch to the government side. Since the government wanted to encourage this defection for propaganda purposes, it used its huge majority to make a constitutional change that said that people could switch sides without losing their seats, provided parliament voted to approve each such switch. This guaranteed that only party switching that favored the government could take place, since only the government side had the votes to approve the switch.

This kind of frequent, ad hoc changing of the constitution to serve narrow partisan ends resulted in a devaluation of the respect that a constitution should command. At one time, a joke made the rounds in Sri Lanka about someone entering a bookstore, asking to purchase a copy of the constitution, and being told, “Sorry, we do not sell periodicals.”

While I have observed this trend in Sri Lankan politics over the last three decades or so, its parallel development in the US is much more recent. The fight over the use of the filibuster, the attempts to enshrine a flag-burning amendment in the constitution, the battles over judicial nominees, and the attempts to breach the establishment clause of the First Amendment are signs that using raw political power to gain short-term goals is gaining ground here too. The argument seems to be that political power is there to be used in whatever way possible.

There are two views on this trend. Some disapprove, saying that consensus government is preferable, since it avoids nasty partisan battles and wild swings in policies. They appeal for ‘bipartisanship.’ Others argue that the problem with consensus politics and bipartisanship is that the politics of the most reactionary elements wins out, since bipartisanship usually results in the most intransigent person or party getting his or her own way. Also, bipartisanship can be a symptom that the two major parties are in fact colluding to protect their common interests at the expense of the excluded classes. Such people argue that at least with the use of raw political power, there is a chance that your side will someday be in the ascendant and able to pursue the policies that you favor.

This is a tricky question to which there is no simple answer, at least one that I can see.

“The Bible says…”

One of the things I benefited most from in once being an ordained lay preacher was having to study the Bible in a fairly formal way. The Bible is a fascinating book, and studying it in some depth reveals treasures that might be missed by those who just pick out bits here and there.

For example, I discovered that some of the books of the so-called “minor” prophets of the Old Testament (Jonah and Amos were my particular favorites), when taught by scholars, make for great reading and are full of insights into the human condition. The Bible also has passages that astound you with their poetic beauty and precision of thought. Take, for example, this verse from Ecclesiastes (9:11) that addresses the seeming disconnect between ability and reward, and the general randomness of life:

I returned and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favor to men of skill; but time and chance happeneth to them all.

And we are constantly reminded of how indebted we are to two sources (the Bible and Shakespeare) for so many of the phrases that we use in everyday language.

But another benefit of studying the Bible is that I am immediately on the alert when someone says “The Bible says X” in order to support some position. My first response is “Where exactly does it say it?” Quite often, they cannot quote a supporting verse and you realize that they simply think the Bible should say that, because they strongly believe it. It has become part of folklore.

So when someone says “The Bible says X”, always ask for supporting evidence.

The second point is that even when such people actually have a quote to back up their assertion, you can often point to other quotes that contradict their position or put it in a different light or context. This is because the Bible says a lot of things. It is an immense book with many authors, written over a long span of time, in more than one language, and from the perspective of many different cultures. There is also the fact that (as some of the commenters to this blog have pointed out previously) the translation of ancient Hebrew and Greek and other texts into English involves the introduction of some unavoidable ambiguities. The Bible is by no means a clear statement of beliefs and values that can be easily inserted into modern-day political and ideological battles, and it can be claimed to be so only by deliberately cherry-picking bits and pieces to serve an agenda. When, in The Merchant of Venice (act 1, sc. 3), Shakespeare has Antonio say “The devil can cite Scripture for his purpose,” he is right. The Bible can be quoted to support a vast range of positions, some of them truly bizarre, so arguing on the basis of Biblical texts, taken literally, is rarely conclusive.

I remember one time some years ago when Jehovah’s Witnesses came to my house to sell their magazine and to try to convert me. I am usually friendly to them, since I admire their devotion to their cause and they are invariably polite (a quality that I like), but I try to tell them as gently as possible that I am not interested. But one of them tried to pique my interest by pointing to the feature article in that month’s magazine, which argued that AIDS was God’s punishment on homosexuals. This definitely got my attention, as I happen to think that that is one of the sickest ideas ever conceived, and I thus got drawn into an argument. They produced the usual Biblical quotes against homosexuality. I argued that one had to interpret the Bible in the context of when it was written and the mores that existed at that time, and that the Bible’s message could change with time.

The Witness flatly rejected my contention, saying that no re-interpretation was possible. The Bible’s message was universal in scope and unchanging with time. I then mentioned Paul’s letter to Philemon, in which he seems to have urged Philemon’s runaway slave to accept his position and return to his master. Did that mean, I asked, that slavery was acceptable? The Witness (who was black, which was why I had chosen this particular story) was taken aback and said that we had to interpret that story in a sophisticated way in order to understand its real message. I then asked why we should do that for slavery and not for homosexuality, and of course, there is really no answer to that. In fact, the Bible asserts that God does and condones the most appalling things, actions that are truly monstrous. There is no way to resurrect a belief in a loving God without some serious textual criticism, re-interpretation, and re-evaluation of these passages.

The third thing you often find about people who glibly assert “The Bible says…” is that they rarely quote Jesus’ actual words, which is odd if you call yourself a Christian. For Christians, Christ’s teachings are supposed to be the final word, and yet many Biblical fundamentalists seem to prefer to quote the Old Testament, the letters of Paul, or Revelation. Could this be because Jesus preached a far more tolerant message than many who now confidently claim to speak in his name? Jesus was constantly hanging out with those whom we would consider low-lifes, prostitutes and the like, and was not judgmental about them. He was more likely to be critical of those who sat in judgment on others.

For example, the Plain Dealer in its issue of Saturday, July 2, 2005 (page E3) had one of those inane features in which anonymous people respond to some question. (What is the point of such features? To let random people vent their spleen?) The question this time was: “Would you want your religious leader to bless same-sex unions?” One respondent said no because “the Bible says to speak out against sin, and homosexual relations are a sin (1 Corinthians 6:9)…I could never understand how one could be considered a Christian and be an unrepenting homosexual.” To this person’s credit, he/she gave a citation to one of Paul’s letters. (Paul is the go-to guy in the New Testament if one is looking for support for intolerant views.)

But if you look up the passage, this is what it says in full (in the authoritative [UPDATE: After the comment by Mark, I realize that I have been guilty of sloppy language and should have used the word ‘familiar’ instead of ‘authoritative’ since I am not really a competent judge of the latter] King James version): “Know ye not that the unrighteous shall not inherit the kingdom of God? Be not deceived: neither fornicators, nor idolaters, nor adulterers, nor effeminate, nor abusers of themselves with mankind, Nor thieves, nor covetous, nor drunkards, nor revilers, nor extortioners, shall inherit the kingdom of God.” So rather than being a particularly outrageous sin, homosexuality is not even mentioned, but being effeminate is said to be evil. In some translations, ‘effeminate’ is replaced with ‘homosexual’, but the two words are clearly not equivalent. (The Living Bible, a modern (1971), much looser translation with an evangelical tilt, gives the list as “idol worshipers, adulterers, male prostitutes, homosexuals, thieves, greedy people, drunkards, abusers, and swindlers.” Note how “fornicators” has been dropped and how “effeminate” and “abusers of themselves with mankind” have been changed, showing significant distortions in meaning. For this reason, serious Biblical scholars do not recommend its use.)

Whatever one’s religious beliefs, one can learn a lot from the Bible. But what you learn may not quite be what you expect.

POST SCRIPT

Steve Perry, the editor of the Minneapolis/St. Paul weekly newspaper City Pages, is, to my mind, one of the shrewdest observers of the national political scene. Last week’s Free Times had a cover story by him (Gagging Dr. Dean) that explains why the Democratic Party seems so reluctant to fight for the kinds of policies that its rank and file might want. For those of you who missed the article, you can read it here.

In an earlier essay written in 2002 titled Spank the Donkey, Perry is more cynical and argues that the Democratic Party may be beyond salvaging, so beholden has it become to its big-money contributors.

Politics and religion-3

There is no doubt that people’s religious beliefs often have political implications. For example, if your religious beliefs require you to live according to certain principles, and the actions resulting from those principles bring you into conflict with the law, then you may feel an obligation to work to change the laws. Typically this is done by advocating and lobbying for specific legislation or, in the case of civil disobedience campaigns, by defying the law and taking the consequences in order to show the unjustness of the laws and thus sway public opinion. The latter strategy was used with great effect by Mahatma Gandhi and Martin Luther King. While Gandhi was secular, King was overtly religious and made no secret of the fact that he was driven at least partly by his religious convictions.

Politics and religion-2

As I said before, the significant beginnings of Buddhist religious involvement in Sri Lankan politics came with the stunning 1956 landslide parliamentary victory of an underdog candidate who ran on a platform that shrewdly mixed nationalist politics with an appeal to the ethnic-religious Sinhala-Buddhist population, promising that they would receive favorable treatment under his government.

While this resulted in a short-term benefit for the new Prime Minister and his government, they found it hard to meet the raised expectations of their aroused base, and pretty soon things started falling apart. The most serious failure was the government’s inability to implement a deal to meet the needs of the minority Tamil population, because of opposition from its more extreme Sinhala-Buddhist supporters, who argued then (and have done so ever since) that almost any concession to Tamil interests was a sell-out of the nation’s Sinhala-Buddhist heritage. This was followed in 1958 by a pogrom aimed at Tamils that resulted in many deaths, injuries, and displacements, and in 1959 the Prime Minister himself was assassinated by a Buddhist monk in a plot led by some Buddhist clergy, people who had once been his supporters.

But despite this seriously negative outcome, the die had been cast as far as political appeals to ethnic-religious chauvinist elements were concerned. Other politicians noted how successful such appeals had been in garnering votes, and almost immediately members of all political parties started falling over themselves to pander to the majority religion. Politicians who had not been known for their religious devotion ‘got religion’ in a big way.

This pandering took the form of public piety, making sure that everyone was aware of how religiously observant they were. They would make public shows of going to Buddhist temples, paying courtesy calls on the major Buddhist clerics, incorporating religious themes into speeches, etc. (Does this seem familiar in the US context?) Even some of the members of Marxist parties started doing these things, such was the pressure to conform to this new standard.

Governments started public funding of temples and the clergy, going so far as to provide temples with Mercedes-Benz limousines to transport the clerics. The irony is that Buddhism itself is a religion in the ascetic tradition, with the Buddha himself (the former prince Gautama) rejecting all worldly goods and attachments, seeing such things as barriers to attaining enlightenment and nirvana.

Perhaps the best example of the extent to which this kind of religious pandering led to absurd policies came in the way the calendar was changed. (You are going to find the following story hard to believe but it is true. I lived through this.) The Buddhist calendar is based on the lunar cycle. The full moon has always had religious significance for Buddhists because it is believed that the Buddha was born, attained enlightenment, and died on a full moon day. So one government, in its desire to pander to religious sentiment, decided that the weekly calendar that had the weekend on Saturday and Sunday was too Christian-centered and that what was needed was a Buddhist-centered calendar built around the lunar cycle. So the full moon, quarter moon, new moon, and three-quarter moon days were made holidays (called ‘poya’ days), as were the days just preceding them (the ‘pre-poya’ days). Thus the pre-poya and poya days became the new weekends, replacing Saturday and Sunday.

Since these days need not coincide with Saturday and Sunday, a new system had to be devised to keep track of weekdays. So the weekdays were called P1, P2, P3, P4, and P5, standing for the ‘first day after poya’, ‘second day after poya’, and so on. The catch is that since the lunar cycle is around 29½ days, every fourth week or so (there was no definite pattern) you would have an extra workday in the week, which was called P6. Keeping track of these things and scheduling future events became a nightmare. Every time a week with the extra day kicked in, the authorities would have to decide which of the five weekday schedules would be followed on the extra day.
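For the arithmetically curious, here is a minimal sketch of how the extra day arises, using a deliberately simplified model (evenly spaced lunar quarters at the mean cycle length of 29.53 days; the real quarter-moon intervals are irregular, which is part of why the pattern was unpredictable):

```python
# A simplified sketch of the poya-week arithmetic described above.
# Assumes evenly spaced lunar quarters at the mean synodic month;
# the real astronomical intervals are irregular.
SYNODIC_MONTH = 29.53  # mean lunar cycle in days

def poya_days(n_quarters):
    """Approximate day numbers on which the quarter-moon (poya) holidays fall."""
    quarter = SYNODIC_MONTH / 4  # roughly 7.4 days between poya days
    return [round(i * quarter) for i in range(n_quarters)]

def label_weeks(poyas):
    """Label the workdays between consecutive poya weekends P1, P2, ..."""
    weeks = []
    for a, b in zip(poyas, poyas[1:]):
        # The pre-poya day (b - 1) and the poya day (b) form the weekend,
        # so the workdays run from a + 1 through b - 2.
        weeks.append([f"P{i}" for i in range(1, b - a - 1)])
    return weeks

if __name__ == "__main__":
    for week in label_weeks(poya_days(10)):
        print(week)  # most weeks stop at P5, but some run to an extra P6 day
```

Because a quarter of the lunar cycle is not a whole number of days, consecutive poya weekends are sometimes seven and sometimes eight days apart, and it is the eight-day gaps that pick up a P6.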

It also made interactions with the rest of the world problematic, because the periodic occurrence of the extra-long week meant that the poya days did not have a fixed relationship to the standard days of the week. Since the rest of the world worked on the standard week, people outside Sri Lanka never knew when we were off on our weekends, disrupting international trade.

This was the system that existed when I was in middle and high school, and it was confusing for everyone, to put it mildly. It is surprising that it lasted as long as it did (many years), but it finally collapsed because everyone just got sick of it, and Sri Lanka reverted to the standard system, seemingly without any religious objections. As a sop to the religious wing, the full moon day of every month was retained as a religious holiday, so that Sri Lankans now probably have more public holidays than any other nation.

The point of this story is that once political parties start competing for religious support, there seems to be no end to the kinds of ridiculous things that can ensue. The messing around with the weekly calendar was confusing and ridiculous but relatively benign. More serious is when these actions result in one group feeling that it is only their religious sentiment that matters when it comes to forming public policy.

In the US, there are already signs of the increased public piety among elected officials. They talk about their religion and their visits to churches are publicized. Religious spokespersons are invited to the White House. “Prayer breakfasts” are held routinely by elected officials. We have official “days of prayer.”

It also seems to have become routine for Presidents and other politicians to end their major speeches with the phrase “God Bless America.” This is relatively new. When President Kennedy spoke to the nation on the eve of the Cuban Missile Crisis, perhaps the closest the world has come to all-out nuclear war, he ended his speech with a simple “Thank you and good night.” This was the same ending used by President Nixon when he spoke to the nation in 1972 about his plans for the war in Vietnam. Although Presidents up to and including the overtly religious Carter occasionally inserted references to God in their speeches, this was reserved for special rhetorical flourishes, and it did not become a standard ending tagline for speeches until President Reagan. It was his successor George H. W. Bush (the current president’s father) who really went over the top, ending his speeches with almost pep-rally-like appeals for God’s blessings. (See the article by Jonathan Rauch in the National Journal for a review of God’s appearance in presidential speeches. Rauch also makes the astounding claim that seven states even prohibit atheists from holding public office! His article was written in 1999.)

When politicians feel the need for public statements of piety, then I think we are going down a dangerous road. There is a gripping ten-minute video clip from the TV program The West Wing that captures this issue very well. The clip is must-see TV. In it the senator portrayed by Alan Alda, under pressure to make a show of his religion, makes this comment to the press: “If you demand expressions of religious faith from politicians, you are just begging to be lied to…And it will be one of the easiest lies to ever have to tell to get your votes.” (To see the video, just click on the still of Alan Alda. You need QuickTime to play it, and that is a free download if you don’t already have it.)

I have always believed that the secular state is the most just state. It also would fit (I think) with John Rawls’s ‘justice as fairness’ model for society. Many people think that ‘secular’ means atheist, but that is wrong. A secular state means that laws must be neutral with respect to any and all religions, or the lack thereof. The government cannot promote any religion or deny people the right to practice the religion of their choice. The establishment clause of the First Amendment to the US constitution pretty much says it best: “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.”

It would be a pity to undermine such a good idea.