The questions not asked II – UN resolutions

It’s time to play another game of The questions not asked. This is where we examine the reporting of some news event and try to identify the obvious questions that should have been posed by the media, or the context that should have been provided to better understand the event, but wasn’t.

Today’s example is taken from a speech given by George W. Bush on March 8, 2005 and reported in the Houston Chronicle.

“The time has come for Syria to fully implement Security Council Resolution 1559,” Bush told a largely military audience at the National Defense University. “All Syrian military forces and intelligence personnel must withdraw before the Lebanese elections for those elections to be free and fair.”

Bush, in a speech touting progress toward democracy in the broader Middle East, did not say what might follow failure to comply.

At the White House, spokesman Scott McClellan also left the question open. “If they don’t follow through on their international obligations, then, obviously, you have to look at what the next steps are,” McClellan said.

So what questions were not posed? What context was not provided?

One immediate answer is to compare the situations in Lebanon and Iraq. How can Bush say that the Lebanese elections cannot be free and fair because of the presence of 14,000 Syrian troops there, when ten times that many US troops were present in Iraq during its elections in January, yet those elections were praised?

But that question was not asked, the context not provided.

But there is another obvious angle to this particular case that was also overlooked, and that is the way in which UN resolutions are used selectively to justify US policy decisions.

UN resolutions routinely call, among other things, for the withdrawal of foreign troops from other countries. And given that the UN is, for want of anything better, the closest thing we have to a global consensus, such resolutions should be taken seriously.

But this is not the first time that UN resolutions calling for the withdrawal of occupying troops have been defied. For example, Stephen Zunes, professor of politics and chair of the Peace & Justice Studies Program at the University of San Francisco, says in his article US Double Standards in the October 22, 2002 issue of The Nation magazine that more than ninety UN resolutions are currently being violated, and that the vast majority of the violations are by countries closely allied with the US. He says:

For example, in 1975, after Morocco’s invasion of Western Sahara and Indonesia’s invasion of East Timor, the Security Council passed a series of resolutions demanding immediate withdrawal. However, then-US ambassador to the UN Daniel Patrick Moynihan bragged that “the Department of State desired that the United Nations prove utterly ineffective in whatever measures it undertook. The task was given to me, and I carried it forward with no inconsiderable success.” East Timor finally won its freedom in 1999. Moroccan forces still occupy Western Sahara. Meanwhile, Turkey remains in violation of Security Council Resolution 353 and more than a score of resolutions calling for its withdrawal from northern Cyprus, which Turkey, a NATO ally, invaded in 1974.

The most extensive violator of Security Council resolutions is Israel. Israel’s refusal to respond positively to the formal acceptance this past March by the Arab League of the land-for-peace formula put forward in Security Council Resolutions 242 and 338 arguably puts Israel in violation of these resolutions, long seen as the basis for Middle East peace. More clearly, Israel has defied Resolutions 267, 271 and 298, which demand that it rescind its annexation of greater East Jerusalem, as well as dozens of other resolutions insisting that Israel cease its violations of the Fourth Geneva Convention, such as deportations, demolition of homes, collective punishment and seizure of private property. Unlike some of the hypocritical and meanspirited resolutions passed by the UN General Assembly, like the now-rescinded 1975 resolution equating Zionism with racism, these Security Council resolutions are well grounded in international law and were passed with US support or abstention. Security Council Resolutions 446, 452 and 465 require that Israel evacuate all its illegal settlements on occupied Arab lands.

All the UN resolutions pointed to by Zunes are very serious and are much older than Resolution 1559 now being used against Syria, so these violations are long standing. All this information is in the public record. Any reasonably competent journalist should know it and, when the administration (and this is done by both Republican and Democratic administrations) cynically invokes UN resolutions selectively to achieve narrow political ends, should be able to pose the relevant question of why only some UN resolutions have to be followed while others are ignored.

But the mainstream journalists don’t do this. One question is why. But the more important question is, since they don’t do their job, what can we do to make up for it?

Evolutionary theory and falsificationism

In response to a previous posting, commenter Sarah Taylor made several important points. She clearly articulated the view that evolutionary theory is a complex edifice that is built on many observations that fit into a general pattern that is largely chronologically consistent.

She also notes that one distinguishing feature of science is that there are no questions that it shirks from, that there are no beliefs that it is not willing to put to the test. She says that “What makes scientific theories different from other human proposals about the nature of the universe are their courage. They proclaim their vulnerabilities as their strengths, inviting attack.”

I would mostly agree with this. Science does not shy away from probing its weaknesses, although I would not go so far as to claim that the vulnerabilities are seen as strengths. What is true is that the ‘weaknesses’ of theories are not ignored or covered up but are seen as opportunities for further research. Since there is no such thing in science as infallible knowledge, there is no inherent desire to preserve any theory at all costs, and the history of science is full of once dominant theories that are no longer considered credible.

But having said all that, it is not necessarily true that finding just one contradiction with a theory is sufficient to overthrow the theory. In the context of the challenge to Darwinian theory by intelligent design (ID) advocates, Sarah’s statement that “All that any ID devotee has to do is to show ONE fossil ‘out of place’, to prove the theory doesn’t work. Just one horse shoulder blade in a Cambrian deposit somewhere in the world, and we can say goodbye to Darwin” is a little too strong.

Sarah’s view seems to be derived from the model of falsificationism developed by the philosopher of science Karl Popper (see his book Conjectures and Refutations: The Growth of Scientific Knowledge, 1963), who was trying to explain how science progresses. After showing that trying to prove theories to be true was not possible, Popper argued that what scientists should instead do is try to prove theories false by finding a single counter-instance to the theory’s predictions. If that happens, the theory is falsified and has to be rejected and replaced by a better one. Hence the only possible statuses of a scientific theory are ‘false’ or ‘not yet shown to be false.’

But historians of science have shown that this model, although appealing to our sense of scientific bravado, does not describe how science actually works. Scientists are loath to throw away perfectly productive theories on the basis of a few anomalies. If they did so, then no non-trivial theory would survive. For example, the motion of the perigee of the moon’s orbit disagreed with Newton’s theory for nearly sixty years. Similarly the stability of the planetary orbits was an unsolved problem for nearly 200 years.

Good theories are hard to come by and we cannot afford to throw them away at the first signs of a problem. This is why scientists are quite agreeable to treating such seeming counter-instances as research problems to be worked on, rather than as falsifying events. As Barry Barnes says in his T.S. Kuhn and Social Science (1982): “In agreeing upon a paradigm scientists do not accept a finished product: rather they agree to accept a basis for future work, and to treat as illusory or eliminable all its apparent inadequacies and defects.”

Dethroning a useful theory requires an accumulation of evidence and problems, and the simultaneous existence of a viable alternative. It is like a box spring mattress. One broken spring is not sufficient to make the mattress useless, since the other springs can make up for it and retain the mattress’s functionality. It takes several broken springs to make the mattress a candidate for replacement. And you only throw out the old mattress if you have a better one to replace it with, because having no mattress at all is even worse. The more powerful and venerable the theory, the more breakdowns that must occur to make scientists skeptical of its value and open to having another theory replace it.

After a theory is dethroned due to a confluence of many events, later historians might point to a single event as starting the decline or providing the tipping point that convinced scientists to abandon the theory. But this is something that happens long after the fact, and is largely a rewriting of history.

So I do not think that finding one fossil out of place will dethrone Darwin. And ID does not meet the necessary criteria for being a viable alternative anyway, since it appeals to an unavoidable inscrutability as a factor in its explanatory structure, and that is an immediate disqualification for any scientific theory.

The purpose of college

Why go to college?

For some, college is just a stage in the educational ladder after high school and before entering the working world or going to graduate school. In this view, college is primarily the place where you obtain an important credential that is the pre-requisite for securing well-paying jobs. This is not an insignificant consideration.

Others might see college as the place where you both broaden and deepen your knowledge in a range of subjects and develop higher-order skills such as critical thinking, writing, and research.

All these things are undoubtedly valuable and worth pursuing. But for me, I think the primary purpose of college is that it is the place where you start to lay the foundations for a personal philosophy of life.

What I mean by this is that at least in college we need to start asking ourselves the question: “Why do I get up in the morning?” For some, the answer might be “Why not? What other option is there?” For others it might just be a habit that is unquestioned. For yet others, it might be that they have particular ambitions in life that they want to achieve. For yet others, it might be because other people depend on them to do various things.

But while all these considerations undoubtedly play a part for all of us, the question that I am addressing goes somewhat beyond that and asks what we think of as our role in the universe. What is it that gives our lives meaning? What should be the basis of our relationships with our family and friends and society? What is our obligation to all those to whom we are tied together by a common humanity? What should be our relationship with nature and the environment?

All of us think about these things from time to time. But I suspect that these various areas of our lives remain somewhat separate. By ‘developing a personal philosophy of life’, I mean the attempt to pull together all these threads and weave a coherent tapestry where each part supports and strengthens the other.

I think that the university is a wonderful place to start doing this because it has a unique combination of circumstances that can, at least in principle, enable this difficult task to be pursued. It has libraries, it has scholars, it has courses of study that can enable one to explore deeply into areas of knowledge. It provides easy access to the wisdom of the past and to adventures towards the future. But most importantly, it has people (students and staff and faculty) of diverse backgrounds, ages, ethnicities, nationalities, gender, etc.

But I wonder if we fully take advantage of this opportunity or whether the day-to-day concerns of courses, homework, research, teaching, studying prevent us from periodically stepping back and trying to see the big picture. In fact, it looks like the search for broader goals for college education is declining alarmingly. In 1969, 71% of students said they felt it essential that college help them in “formulating the values and goals of my life.” 76% also said that “learning to get along with people” was an essential goal of their college experience.

But by 1993, those percentages had dropped to 50% and 47% respectively, falling from the top-ranked items to the bottom, displaced by an emphasis on training and skills and knowledge in specialized fields. (Source: When Hope and Fear Collide by Arthur S. Levine and Jeannette S. Cureton, 1998, table 6.1, page 117.)

In my mind, this is an alarming trend and needs to be reversed.

One thing that events like the tsunami do, even for those not directly affected by it, is to bring us up short, to realize the fragility of life and the importance of making the most out of our time here. It reminds us that there are big questions that we need to ask and try to answer, and we cannot keep avoiding them.

This kind of thoughtful introspection mostly occurs outside formal classes, in the private discussions that we have in informal settings, in dorms, lounges, parks, offices, and coffee shops. But how often does it happen? And how can we create a university atmosphere that is conducive to making people realize the importance of having such discussions?

The meaning that we attach to life will depend on a host of individualized factors, such as our personal histories, what we value most, and what we are willing to give up. And we may never actually create a fully formed personal philosophy of life. The philosophy we do develop will most likely keep changing with time as our life experiences change us.

But the attempt to find out what our inner core is so that we act in life in ways that are consistent with it is something that I think college is perfectly suited for. I only hope that most people take advantage of it.

A Theory of Justice

I have to confess that this blog has been guilty of false advertising. Of all the items listed on the masthead, the one thing I have not talked about is books, and it is time to make amends.

But first some background. Last week, I spent a thoroughly enjoyable evening having an informal conversation with about 20 students in the lobby of Alumni Hall (many thanks to Carolyn, Resident Assistant of Howe for organizing it). The conversation ranged over many topics and inevitably came around to politics. I had expressed my opposition to the attack on Iraq, and Laura (one of my former students) raised the perfectly legitimate question about what we should do about national leaders like Saddam Hussein. Should we just let them be? My response was to say that people and countries need to have some principles on which to act and apply them uniformly so that everyone (without exception) would be governed by the same principles. The justifications given by the Bush administration for the attack on Iraq did not meet those conditions.

But my response did not have a solid theoretical foundation and I am glad to report that a book that I have started reading seems to provide just that.

The book is A Theory of Justice by John Rawls, in which the author tries to outline what it would take to create a system that would meet the criteria of justice as fairness. The book was first published in 1971 but I was not aware until very recently of its existence. I now find that it is known by practically everyone and is considered a classic, but as I said elsewhere earlier, my own education was extraordinarily narrow, so it is not surprising that I was unaware of it until now.

Rawls says that everyone has an intuitive sense of justice and fairness and that the problem lies in how to translate that desire into a practical reality. Rawls’ book gets off to a great start in laying out the basis for how to create a just society.

“Men are to decide in advance how they are to regulate their claims against one another and what is to be the foundation charter of their society…Among the essential features of this situation is that no one knows his place in society, his class position or social status, nor does anyone know his fortune in the distribution of natural assets and abilities, his intelligence, strength, and the like…The principles of justice are chosen behind a veil of ignorance.” (my emphasis)

In other words, we have to decide what is fair before we know where we will fit into society. We have to create rules bearing in mind that we might be born to any of the possible situations that the ensuing structure might create. Right now what we have is ‘victor’s justice’, where the people who have all the power and privilege get to decide how society should be run, and their own role in it, and it should not surprise us that they see a just society as one that gives them a disproportionate share of the benefits.

Rawls argues that if people were to decide how to structure society based on this ‘veil of ignorance’ premise, they would choose two principles around which to organize things. “[T]he first requires equality in the assignment of basic rights and duties, while the second holds that social and economic inequalities, for example, inequalities of wealth and authority, are just only if they result in compensating benefits for everyone, and in particular for the least advantaged members of society. These principles rule out justifying institutions on the grounds that the hardships of some are offset by a greater good in the aggregate.”

Rawls’ argument has features similar to those young children use when sharing something, say a pizza or a cookie. The problem is that the person who gets to choose first has an unfair advantage. This problem is overcome by deciding in advance that one person divides the object into two portions while the other person gets first pick, thus ensuring that both people should feel that the ensuing distribution is fair.

(Here is an interesting problem: How can you divide a pizza among three people so that everyone has the sense that it was a fair distribution? Remember, this should be done without precision measurements. The point is to demonstrate the need to set up structures so that people will feel a sense of fairness, irrespective of their position in the selection order.)
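For readers who like to tinker, the two-person divide-and-choose procedure can be sketched in code. This is purely my own toy illustration, not anything from Rawls: I model the pizza as the interval [0, 1] and each person’s tastes as a valuation function that assigns a worth to any slice (a, b), with the whole pizza worth 1. All the names and functions here are invented for the example.

```python
def cut_point(divider_value, target=0.5, tol=1e-9):
    """Find x (by bisection) so the divider values the slice [0, x] at `target`."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if divider_value(0.0, mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def divide_and_choose(divider_value, chooser_value):
    """The divider cuts at what she sees as the halfway point;
    the chooser takes whichever piece he values more.
    Each ends up with at least half by their own lights."""
    x = cut_point(divider_value)
    left, right = (0.0, x), (x, 1.0)
    if chooser_value(*left) >= chooser_value(*right):
        return {"chooser": left, "divider": right}
    return {"chooser": right, "divider": left}

# Example: the divider values the pizza uniformly; the chooser prefers
# the right half (say, the half with the pepperoni) twice as much.
uniform = lambda a, b: b - a

def pepperoni_lover(a, b):
    # Density 2/3 on [0, 0.5) and 4/3 on [0.5, 1], so the whole pizza is worth 1.
    def cdf(x):
        return (2 / 3) * min(x, 0.5) + (4 / 3) * max(0.0, x - 0.5)
    return cdf(b) - cdf(a)

result = divide_and_choose(uniform, pepperoni_lover)
```

Here the chooser takes the right-hand piece, which he values at 2/3, while the divider is left with a piece she values at exactly half, so neither can complain. The reason the three-person version is a genuine puzzle is that this simple cut-then-pick trick no longer works; envy-free procedures for three (such as the Selfridge-Conway method) exist but require extra cutting and trimming rounds.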

All this great stuff is just in the first chapter. Rawls will presumably flesh out the ideas in the subsequent chapters and I cannot wait to see how it comes out.

I will comment about the ideas in this book in later postings as I read more, because I think the ‘veil of ignorance’ gives a good framework for understanding how to make judgments about public policy.

Where was God during the tsunami?

Last Thursday I moderated a panel discussion (sponsored by the Hindu Students Association and the Religion Department at Case) on the topic of theodicy (theories to justify the ways of God to people, aka “why bad things happen to good people”) in light of the devastation wreaked by the tsunami, which killed an estimated quarter million people.

The panel comprised six scholars representing Judaism, Islam, Jainism, Christianity, Hinduism, and Buddhism and the discussion was thoughtful with a good sharing of ideas and concerns.

As the lay moderator not affiliated with any religious tradition, I opened by saying that it seemed to me that events like the tsunami posed a difficult problem for believers in a God because none of the three immediate explanations that come to mind about the role of God are very satisfying. The explanations are:

  1. It was an act of commission. In other words, everything that happens is God’s will including the tsunami. This implies that God caused it to happen and hence can be viewed as cruel.
  2. It was an act of omission. God did not cause the tsunami but did nothing to save people from its effects. This implies that God does not care about suffering.
  3. It is a sign of impotence. God does care but is incapable of preventing such events. This implies that God is not all-powerful.

These questions can well be asked even for an isolated tragic event like the death of a child. But in those cases, it is only the immediate relatives and friends of the bereaved who ask such things. The tsunami caused even those not directly affected to be deeply troubled and it is interesting to ask why this is so.

Some possible reasons for this widespread questioning of religion are that the tsunami had a very rare combination of four features:

  1. It was a purely natural calamity with no blame attached to humans. Other ‘natural’ disasters such as droughts and famines can sometimes be linked indirectly to human actions and blame shifted from God.
  2. The massive scale of death and suffering.
  3. The rapidity of the events, the large number of deaths on such a short time-scale.
  4. The innocence of so many victims, evidenced by the fact that a staggering one-third of the deaths were of children.

Of course, although rare, such combinations of factors have occurred in the past and all the major religions are old enough to have experienced such events before and grappled with the theological implications. It was interesting to see the different ways in which the four theistic religions (Judaism, Hinduism, Christianity, and Islam) and the two non-theistic religions (Buddhism and Jainism) responded. But whatever the religion, it was clear that something has to give somewhere in the image of an all-knowing, all-powerful, benevolent God, whose actions we can comprehend.

As one panelist pointed out, the last feature (of the ability to comprehend the meaning of such events) is dealt with in all religions with an MWC (“mysterious ways clause”) that can be invoked to say that the actions of God are inscrutable and that we simply have to accept the fact that a good explanation exists, though we may not know it.

Each panelist also pointed out that each religious tradition is in actuality an umbrella of many strands and that there is no single unified response that can be given for such an event. Many of the explanations given by each tradition were shared by the others as well. In some ways, this diversity of explanations within each tradition is necessary because it is what enables them to hold on to a diverse base of adherents, each of whom will have a personal explanation that they favor and who will look to their religion for approval of that particular belief.

The possible explanations range over the following: that things like the tsunami are God’s punishment for either individual or collective iniquity; that they are sent to test the faith of believers (as in the Biblical story of Job); that God created natural laws and lets those laws work their way without interference; that God is “playing” with the world to remind us that this life is transitory and not important; that the tsunami was sent as a sign that the “end times” (when the apocalypse arrives) are near and hence should actually be seen as a joyous event; that it was a sign and reminder of God’s power and meant to inspire devotion; it was to remind us that all things are an illusion and that the events did not “really” happen.

(Update: Professor Peter Haas, who spoke about Judaism, emails me that I had overlooked an important aspect of that religious tradition. He says that: “My only comment would be that you did not quite capture my point about Judaism, which was that the real question is less about WHY things like the Tsunami happened but about how we are to respond to such human suffering given that we live in a world where such things happen.”)

All of these explanations posit a higher purpose for the tsunami, and some definitely relinquish the notion of God’s benevolence.

The non-theistic religions have as their explanatory core for events the notion of karma. Karma is often loosely thought of as fate but the speakers pointed out that karma means action and carries the implication that we are responsible for our actions and that our actions create consequences. Thus there is the belief in the existence of cause-and-effect laws but there is no requirement for the existence of a law-giver (or God). The karma itself is the cause of events like the tsunami and we do not need an external cause or agent to explain it. The MWC is invoked even in this case to say that there is no reason to think that the ways the karmic laws work are knowable by humans.

The non-theistic karma traditions do not believe in the existence of evil or an evil one. But there is a concept of moral law or justice (“dharma”) and the absence of justice (“adharma”), and events like the tsunami may be an indication that the total level of dharma in the world is declining. These traditions also posit that the universe is impermanent and that the real problem is our ignorance of its nature and of our transitory role in it.

The problem for the karma-based religions with things like the tsunami is understanding how the karma of so many diverse individuals could coincide so that they all perished in the same way within the space of minutes. But again, the MWC can be invoked to say that there is no requirement that we should be able to understand how the karmic laws work.

(One question that struck me during the discussion was that in Hinduism, a belief in God coexists with a belief in karma and I was not sure how that works. After all, if God can intervene in the world, then can the karmic laws be over-ridden? Perhaps someone who knows more about this can enlighten me.)

(Update: Professor Sarma, who spoke on Hinduism, emails me that: “As for the inconsistencies in Hinduism –there are lots of traditions which are classified under the broad rubric “Hinduism” so the attempt to characterize a unified answer is inherently flawed.”)

Are any of these explanations satisfying? Or do events like the tsunami seriously undermine people’s beliefs in religion? That is something that each person has to decide for himself or herself.

Urban legends in academia?

Did you hear the story about the college professor who asked his class to write a mid-term essay on “Why George Bush is a war criminal,” and then gave an F grade to a student who had been offended by the assignment and had instead turned in one on “Why Saddam Hussein is a war criminal”?

I wrote about this in an op-ed piece that appeared in today’s (March 4, 2005) Plain Dealer.

You will be asked by the site to fill in your zip-code, year of birth, and gender for some kind of demographic survey. It takes about 10 seconds.

Update on 3/14/05

I received a call today from a person associated with Students for Academic Freedom informing me that this op-ed had triggered the release of more information on their website, where more details are given.

Although the student referred to had not in fact given this testimony at the Colorado Senate hearings as had been alleged earlier, the level of detail (which had not been released until now) provided on the SAF website is sufficient to remove this story from the category of urban legends since it does give some names and places and dates. But a judgment on whether this constitutes academic bullying will have to await the release of the facts of the case on what actually transpired between professor and student. My contact at SAF says that the incident is still under investigation.

Update on the update (3/15/05): It gets curiouser and curiouser.

The blog Canadian Cynic reports that new information on this case has come out and that Horowitz is now backtracking on almost all of the key charges that were originally made. Canadian Cynic highlights Horowitz’s statements now that “Some Of Our Facts Were Wrong; Our Point Was Right” and “I consider this an important matter and will get to the bottom of it even if it should mean withdrawing the claim.”

See the article on the website Inside Higher Education. It seems to be the most authoritative source of information on this case.

Content-free political labels

Here’s a quiz. Who said the following:

“In his inaugural address, Mr. Bush calls 9/11 the day “when freedom came under attack.” This is sophomoric. Osama did not send fanatics to ram planes into the World Trade Center because he hates the Bill of Rights. He sent the terrorists here because he hates our presence and policies in the Middle East.

…

The 9/11 killers were over here because we are over there. We were not attacked because of who we are but because of what we do. It is not our principles they hate. It is our policies. U.S. intervention in the Middle East was the cause of the 9/11 terror. Bush believes it is the cure. Has he learned nothing from Iraq?

In 2003, we invaded a nation that had not attacked us, did not threaten us, and did not want war with us to disarm it of weapons it did not have. Now, after plunging $200 billion and the lives of 1,400 of our best and bravest into this war and killing tens of thousands of Iraqis, we have reaped a harvest of hatred in the Arab world and, according to officials in our own government, have created a new nesting place and training ground for terrorists to replace the one we lately eradicated in Afghanistan.”

Was this said by some radical leftist? Some long-haired peacenik? Ward Churchill? Actually, it was Pat Buchanan, a staffer for Richard Nixon and long-time Republican stalwart writing in a recent issue of the magazine The American Conservative.

Ok, here’s another writer:

“The US economy is headed toward crisis, and the political leadership of the country–if it can be called leadership–is preoccupied with nonexistent weapons of mass destruction in the Middle East.

…

Oblivious to reality, the Bush administration has proposed a Social Security privatization that will cost $4.5 trillion in borrowing over the next 10 years alone! America has no domestic savings to absorb this debt, and foreigners will not lend such enormous sums to a country with a collapsing currency–especially a country mired in a Middle East war running up hundreds of billions of dollars in war debt.

A venal and self-important Washington establishment combined with a globalized corporate mentality have brought an end to America’s rising living standards. America’s days as a superpower are rapidly coming to an end. Isolated by the nationalistic unilateralism of the neoconservatives who control the Bush administration, the US can expect no sympathy or help from former allies and rising new powers.”

Who is this Bush-hater? Michael Moore? No, it was none other than Paul Craig Roberts, Assistant Secretary of the Treasury in the Reagan administration and former Associate Editor of the Wall Street Journal editorial page and Contributing Editor of National Review.

The point of my using these quotes is to illustrate my view that the labels ‘liberal’ or ‘conservative,’ ‘Democratic’ or ‘Republican’ have ceased to be meaningful in identifying people’s political positions on many issues. They may have at one time identified particular unifying political philosophies, but now have ceased to have content in that there are no longer any clear markers that one can point to that identify those positions.

Not all political labels have ceased to have content but those four broad-brush categories in particular are used more as terms of political abuse than for any clarifying purpose. Their only purpose is to set up fake debates on television’s political yell shows. If you advertise that you have a liberal and conservative on your panel (or a Democrat and Republican), you can claim that your program is ‘fair and balanced’ even though both people pretty much say the same thing on major policy issues, differing only on minor tactical points or on style.

It makes more sense, rather than identifying and aligning with people on the basis of these meaningless labels, to form alliances on specific issues based on where they stand with respect to those issues. And when one does that, one finds that many of the old divisions melt away.

The greater danger of labels (whether they be of religion, nationality, or politics) is that they are used to divide us and herd us into boxes, making us think in terms of what we should believe and who our allies should be, rather than what we really want them to be. They are being used as weapons to divide people into ineffective warring factions and thus prevent them from finding commonalities that might lead to concerted action.

I do not agree with Buchanan or Roberts on everything they say. On some things I strongly disagree. But unlike the members of the Third-Tier Pundit™ brigade who should be ignored, they are serious people who often have useful information or perspectives to share and I read them regularly.

Dismissing the ideas of some people simply because of the label attached to them makes as little sense as supporting other people for the same reason.

Putting thought police in the classroom

Most of you would have heard by now about the bill pending in the Ohio legislature (Senate Bill 24) to “establish the academic bill of rights for higher education.”

The bill is both silly and misguided. It mixes motherhood-and-apple-pie language (“The institution shall provide its students with a learning environment in which the students have access to a broad range of serious scholarly opinion pertaining to the subjects they study.”) with language that is practically begging students with even minor grievances to complain to higher authorities.

In a previous posting, I spoke about how lack of trust leads to poor learning conditions and that we need to recreate the conditions under which trust can flourish. This bill goes in the wrong direction because it effectively creates a kind of ‘thought police’ mentality, where any controversial word or idea in class can end up causing a legal battle.

Let me give you an example. The bill says “curricula and reading lists in the humanities and social studies shall respect all human knowledge in these areas and provide students with dissenting sources and viewpoints.”

As an instructor, how would I respect “all” the knowledge in the area? What do we even mean by the word “knowledge”? How do we even separate knowledge in the humanities and social sciences from that in the sciences? What constitutes “dissenting viewpoints”? And how far should “dissenting” be taken? If a particular point of view is not mentioned by the instructor, is that grounds for complaint?

Take another example.

“Students shall be graded solely on the basis of their reasoned answers and appropriate knowledge of the subjects and disciplines they study and shall not be discriminated against on the basis of their political, ideological, or religious beliefs.”

Grading is an art, not a science. It is, at some level, a holistic judgment made by an instructor. To be sure, the instructor has a deep ethical obligation to the profession to assign the grade with as much competence and impartiality as he or she can muster. But even granting that, a letter grade or a point allocation for an assignment is not something that can be completely objectified. Give the same essay or problem to two different teachers and they will likely arrive at different grades, even if each grades “solely on the basis of their reasoned answers and appropriate knowledge.” And this can occur irrespective of how agreeable or disagreeable the instructor finds the student’s views. So if a student complains about a grade, how can this be adjudicated?

As I said in a previous posting, the reason we currently have so many rules in our classrooms is that we seem to have forgotten the purpose of courses, and have lost that sense of trust that is so vital to creating a proper learning atmosphere.

This bill, rather than increasing trust in the classroom, will decrease it. Because as soon as there is legislation prescribing what can and cannot be done in the classroom, it will inevitably lead to teaching and grading issues ending up in the courtroom. And in order to avoid that tedious and expensive process, universities will start instituting detailed lists of rules about what can and cannot be done in the classroom, and teachers will start teaching and assessing defensively, simply to avoid the chance of litigation.

Is this what we want or need?

POST SCRIPT

Tomorrow (Thursday, March 3) from 7:00-9:00 pm in Thwing Ballroom, Case’s Hindu Students Association is hosting an inter-religious dialogue on how to reconcile a belief in God in light of major disasters like the recent tsunami.

There will be a panel of religious scholars representing all the major religious traditions (drawn from the faculty of the Religion department at Case and elsewhere) and plenty of time for discussions. I will be the moderator of the discussion.

The event is free and open to the public and donations will be accepted for tsunami relief efforts.

Living in a reality-free world

Here is some news to curl your hair.

The Harris Poll® #14 of February 18, 2005 reports that:

– 47 percent of Americans believe that Saddam Hussein helped plan and support the hijackers who attacked the U.S. on September 11, 2001;
– 44 percent believe that several of the hijackers who attacked the U.S. on September 11 were Iraqis; and
– 36 percent believe that Iraq had weapons of mass destruction when the U.S. invaded that country.

Virtually no one who has followed these stories believes any of the above to be true. And this poll was released just last week, long after the David Kay and Charles Duelfer reports were made public, putting to rest all the overblown claims that were used to justify the attack on Iraq.

Meanwhile, the one claim that experts do believe to be true, that Saddam Hussein was prevented from developing weapons of mass destruction by the U.N. weapons inspectors, is accepted by only 46 percent.

How is it that so many Americans seem to be living in a reality-free world?

The reason is that falsehoods like the ones listed above are strongly implied by influential people and uncritically reported in the media, or influential people stay silent while such falsehoods are propagated.

Take for example a speech made just last week (on February 17, 2005) by California congressman Christopher Cox at the Conservative Political Action Conference. Michelle Goldberg of Salon was at the conference and reports his exact words: “We continue to discover biological and chemical weapons and the facilities to make them inside of Iraq, and even more about their intended use, including that a plan to distribute sarin, and the lethal poison ricin — in the United States and Europe — was actively being pursued as late as March 2003.”

And who were the members of the audience who did not contradict Cox as this nonsense was being spouted? Michelle Goldberg reports that among those “seated at the long presidential table at the head of the room were Henry Hyde, chairman of the House International Relations Committee, Kansas Senator Sam Brownback, Minnesota Senator Norm Coleman, Dore Gold, foreign policy advisor to former Israeli Prime Minister Benjamin Netanyahu, and NRA president Kayne Robinson.” Cox’s comments were made while introducing Vice President Dick Cheney, who gave the keynote address.

Now it is possible to carefully deconstruct the congressman’s words so that some semblance of truth can be salvaged. But that would involve re-defining words like ‘discovered’ and ‘weapons’ and ‘facilities’ and ‘plan’ in ways that would make Clinton’s parsing of the word ‘is’ seem like a model of transparency.

So what are we to make of political leaders who can say such deliberately misleading things? What are we to make of other politicians who know the facts but choose to remain silent while the public is led astray? And what are we to make of the national media who spend enormous amounts of time and space on issues like Michael Jackson’s trial but do not provide the kind of scrutiny, factual information, and context that would make politicians more cautious about what they say?

Politicians who mislead the public may simply be cynical, in that they know the truth and are saying these things for the sake of political expediency. But the danger of allowing this kind of talk to go unchallenged is that it creates an echo chamber in which people hear the same false things from different directions and start to think they must be true. When people start believing their own propaganda, they have entered a reality-free zone, and this can lead to disastrous consequences.

George Orwell in his essay Politics and the English Language (1946) wrote “Political language…is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.” The sad truth is that Cox’s speech is by no means the only, or even the worst, example of this kind of linguistic chicanery. One has only to go back to the days leading up to the invasion of Iraq to see even more egregious examples of deception by the highest-ranking members of the government, and timidity and silence from the supposed watchdogs in the Congress and media.

Is it any wonder that so many people live in a world that does not exist?

The importance of trust in the classroom

The more I teach, the more I feel that there is an inverse correlation between the quality of learning that occurs and the number of rules that govern the classroom. At its best, teaching involves trust between students and teacher, and among fellow students. The assumption should be that we are all there to learn and that we will help each other learn.

To be sure, the teacher has a responsibility to the students and the institution he or she works for to ensure that learning is occurring and that the unavoidable grades that have gained a stranglehold in our educational world are assigned fairly.

But apart from this minimal expectation, I feel that there should be no other rules, except those that are created collectively by the entire class in order that things run smoothly. It is for this reason that my courses are becoming progressively rule-free over time. This is also why I oppose efforts to treat course syllabi as quasi-legal contracts and to mandate what they should and should not contain.

But I know that I am swimming upstream on this one. Many course syllabi are becoming increasingly crammed full of rules and regulations. Why? To my mind, this is a measure of the lack of trust that has developed between students and teachers. Students and faculty don’t really know each other as people. We don’t see ourselves as having come together for an endeavor (learning) that should be enjoyable, that benefits all of us, and that can form the basis of a lifetime relationship. Instead we seem to see ourselves as temporary acquaintances engaged in a commercial transaction. The faculty member has a product or service (knowledge, grades) that the student ‘purchases’ with money, time, and effort.

A natural consequence of this commerce mentality is the need for rules, just like those in the marketplace. Students seem to feel the need to have rules to protect themselves from arbitrary actions by faculty members who are strangers to them, and faculty feel the need to have rules to protect themselves from complaints by students whom they don’t really know. This dynamic inevitably leads to a spiral of increasing rules since having written rules at all implies a lack of trust, which then results in people testing the limits of the rules, which creates the need for more protective rules, which leads to even greater distrust, and so on.

But the reality is that there are only a tiny handful of faculty and students who might take unfair advantage of one another in the absence of a detailed set of rules. In my work in many universities, it is hard for me to recollect cases of faculty members who did not take seriously their ethical obligation to treat students fairly.

This does not mean that faculty members cannot be arrogant, condescending, and unrealistically demanding. We are, after all, human. But it is rare that a teacher will act out of spite against a specific student. And if it does happen, there are mechanisms in universities to try and redress these wrongs when they occur, because the other faculty members know that we can only succeed if we as a learning community try to uphold the highest standards.

We don’t have written rules of behavior among friends. We don’t have written rules of behavior among family members. The reason is that the common interests that bring us together are strong enough to make us want to resolve the issues in a manner of friendly give-and-take. Why is it that we do not even try to create a similar situation in class? Surely a common interest in learning is strong enough to serve a similar role?

When I think about the one change I would recommend to dramatically improve education at all levels, I come to this conclusion: we must create a greater sense of trust in the classroom, so that we can minimize the number of rules and allow the natural enjoyment that true learning provides to emerge.