Algeria and Iraq

I just saw a remarkable film, The Battle of Algiers. Made in black and white (French with English subtitles) in 1966 by the Italian filmmaker Gillo Pontecorvo, it tells the story of the Algerian struggle for independence and the battle between the rebels and the French colonial powers in the capital city of Algiers in the period 1954-1960.

In order to deal with the increasing violence during this period, the French government sends in elite paratroopers led by Colonel Mathieu. Mathieu sets about ruthlessly identifying the structure of the insurgent network, capturing and torturing members to get information on others, and killing rebels and blowing up buildings in pursuit of them, even when those buildings contain civilians. And yet, he is not portrayed as a monster. In one great scene where he is giving a press conference, he is asked about his methods of getting information and the allegations of torture. He replies quite frankly that the French people must decide if they want to stay in Algeria or leave, and if they want to halt the violence against them or let it continue. He says that if they want to stay and stop the violence, then they must be prepared to live with the consequences of how that is achieved. It is the French people’s choice.

One gets the sense that Mathieu does not torture and kill suspects because he enjoys it. He is simply an amoral man who has been given a job to do, and he will get it done using whatever means he deems necessary. This is the kind of military person that political leaders want. They don’t want people who worry about the niceties of human rights and human dignity. But when you train people to deny their normal human feelings, you get the kind of people who carried out the tortures at Abu Ghraib and Guantanamo, and who are even surprised when there is an outcry that what they did was wrong.

And Mathieu does succeed in his task, at least in the short run. By his ruthless methods he destroys the rebel network. But all that this buys is some time. After a lull in the violence for a couple of years, a sudden eruption of mass protests results in Algeria becoming independent in 1962. The French win the battle of Algiers but lose the war of independence.

The film gives a remarkably balanced look at the battle, avoiding the temptation to fall into easy clichés about good and evil. It shows the FLN (National Liberation Front) using women and children to carry out its bombing campaign against French civilians living in the French areas of the city. In one memorable sequence, three young Muslim women remove their veils, cut their hair, put on makeup, and dress like French women to enable them to carry bombs in their bags and pass through the military checkpoints that surround the Muslim sector of the city (the Casbah). They place those bombs in a dance hall, a coffee shop, and an Air France office, bombs that explode with deadly effect, killing scores of civilians who just happen to be there.

In one scene:

Pontecorvo deals with the issue of the killing of innocents by an army vs. such killing by an irregular force. During a press conference, a reporter asks a captured official of the FLN: “Isn’t it a dirty thing to use women’s baskets to carry bombs to kill innocent people?” To which the official answers, “And you? Doesn’t it seem even dirtier to you to drop napalm bombs on defenseless villages with thousands of innocent victims? It would be a lot easier for us if we had planes. Give us your bombers, and we’ll give you our baskets.”

The parallels between Algeria and Iraq are striking, so much so that US policy makers and military officials reportedly viewed this film in the hope of learning how to combat the Iraq insurgency.

As in Iraq, the rebels are Muslims, and the objections they have to being ruled by non-Muslims play an important role in their motivation to revolt. The French had just suffered a humiliating defeat in Vietnam in 1954, and their military was anxious to rehabilitate its reputation by winning elsewhere. In other words, they had their own ‘Vietnam syndrome’ to deal with, just like the US.

In the film, you see how the ability of the insurgents to blend in with the urban population enables them to move around and carry out attacks on the French police and citizenry, with women and children playing important roles. We see how the privileged, Western lifestyle of the French people in Algeria makes them easy targets for attacks. We see how the attacks on French people and soldiers in Algeria cause great fury amongst the French citizenry, leading them to condone the torture, killing, and other brutal methods of the French troops.

One major difference between the French involvement in Algeria and US involvement in Iraq is that Algeria had been occupied by the French for 130 years, since 1830. They had been there for so long that they considered it part of France and refused to consider the possibility of independence. The long occupation also resulted in a significant number of French people living in the city of Algiers, thus making them vulnerable targets. In Iraq, there are very few US civilians, and almost all of them are in the heavily fortified so-called ‘green zone.’

The film takes a balanced look at what an urban guerrilla war looks like, and those who wish to see what might be currently happening in cities like Ramadi and Falluja and Baghdad can get a good idea by seeing this film. The scenes of mass protest by huge crowds of Algerians and their suppression by the occupying French forces are so realistic that the filmmakers put a disclaimer at the beginning stating that no documentary or newsreel footage had been used. And amazingly, this realism was achieved with all novice actors, people who were selected off the streets of Algiers. Only the French Colonel Mathieu was played by a professional actor, but you would not guess it from just seeing the film, since the actors give such natural and polished performances, surely a sign of a great director.

For a good analysis of the film and background on its director, see here. The film is available at the Kelvin Smith Library.

POST SCRIPT: Documentary about Rajini Rajasingham-Thiranagama

Today at 10:00 pm WVIZ Channel 25 in Cleveland is showing No More Tears Sister. I wrote about this documentary earlier.

The strange game of cricket

I am a lifelong fan of cricket and spent an enormous amount of my youth devoted (many would say wasted) to the game. As a boy, much of my free time was spent playing it, reading about it, watching it, or listening to it on the radio. I was such a devoted fan that I would set the alarm to wake up in the middle of the night to listen to crackly and indistinct short wave radio commentary of games from distant time zones in England, Australia, and the West Indies. Such was my fanaticism towards the game that I went to all this trouble to listen to games involving other countries, since Sri Lanka achieved international Test-playing status only in 1981. And now with the internet, I have been able to renew my interest in the game, since the lack of coverage in the US media is no longer a hindrance, so the time wasting has begun anew.

But the game seems to leave Americans cold, just like baseball is hard to appreciate for those from countries that do not play the game. I have become convinced that indoctrination into the joys of slow games like cricket or baseball is something that must occur very early in childhood and is difficult to develop later in life.

To help Americans to understand the game (and thus appreciate the film Lagaan even more), I’ll provide a brief description. For more details, see here.

It is easiest for Americans to understand the traditional form of the game by comparing its features with that of baseball, its closest relative.

The classical form of cricket consists of two teams of eleven players each (nine in baseball). Each team has two innings (nine in baseball). An inning ends when there are ten outs (three in baseball). As in baseball, the team that has the most runs at the end of the game wins.

There are two chief differences from baseball that give cricket its quirky features. The first is that, unlike baseball and like football, the game is time-limited. International games, called ‘Tests’, last for five consecutive days, six hours a day, with breaks for lunch and afternoon tea. Several shorter forms of the game also exist. In Lagaan, for example, the game is limited to three days and one inning for each side.

This time-limited feature means that even after five days, a game can (and often does) end in a no-decision because time has run out and the game ends before either or both teams have had a chance to complete their two innings. The thought that such a result is even possible, let alone not unusual, boggles the mind of Americans, who are used to obtaining a definite result.

The second distinctive feature is that the batsman (batter) who hits the ball is not obliged to run but has the option of choosing to stay put, running only if he is sure that he can complete the run safely. Because of this option, in theory it is possible for the first batsmen to stay out there for five full days, neither scoring runs nor getting out, and the game ending in a no-decision with no runs scored, no outs, and no innings completed by either side. This has never happened. This would be career suicide for the batsmen concerned. The crowds would riot if anyone tried this and their teammates would be incensed.

The reason this potentially possible scenario does not play out is a consequence of the combination of the natural competitive desire of players to win a contest, coupled with the time-limited nature of the game. In order to win, one side must score a high enough total of runs quickly so that they have sufficient time to get the other team out for fewer runs before time runs out. This requires each team to take chances to either score runs or to get the opponents out. It is this continuous balancing of risk with reward that gives the game most of its appeal and thrills.

It is only when winning is seen as a hopeless task for one side that it becomes a good (and even required) strategy to try and play safe for a no-decision. There have been many memorable games in which one side was hopelessly outscored in runs and had no chance to win but salvaged a no-decision by digging in and not trying to score runs but simply not allowing their opponents to get ten outs. That strategy is considered perfectly appropriate and in such situations those batsmen who successfully adopt this dour defensive strategy are hailed. Weather sometimes plays a role in creating no-decisions by reducing the time available for the game.

Paradoxically, the fact that batsmen are not obliged to run after hitting the ball results in cricket being a high-scoring game (since they run only when it seems reasonably safe to do so), with a single innings by one team often lasting for more than a day and a five-day game typically producing 1,500 runs or so.

A cricket field requires a large amount of land and is roughly elliptical in shape, about three to four times the size of a baseball field. Unlike in baseball, where the action takes place in one corner of the field where home plate is, cricket action takes place at the center of the field in a strip about 22 yards long, called the ‘pitch.’ There is no foul territory. At each end of the pitch are the ‘wickets’, three vertical sticks about knee high and spanning a width of about 9 inches. There are always two batsmen on the pitch simultaneously, one standing at each end. A bowler (pitcher) runs up and delivers the ball (with a straight-arm delivery, no throwing allowed) from near the wickets at one end to the batsman guarding the wickets at the other end (the striker). If the ball hits the wicket, the batsman is out (said to be ‘bowled’) and is replaced by the next one.

If the batsman hits the ball away from a fielder and decides it is safe to run, he and the batsman at the other end (the non-striker) run towards the opposite wickets, crossing paths. If the ball is returned to either end and the wicket there is broken before the batsman running towards it reaches it, then that batsman is out (‘run out’). If the batsmen run an odd number (1, 3, 5) of runs safely, then the non-striker becomes the striker. If an even number (2, 4) of runs, then the same batsman retains the strike. Four runs are scored if the striker hits the ball so that it crosses the boundary of the field, and six runs are scored if it does so without first touching the ground. The boundary is not a high wall (as in baseball) but simply a line marked on the ground, usually by a rope.

In addition to getting out by being bowled or run out, a batsman is also out if a hit ball is caught by a fielder before it touches the ground. These forms of getting out are familiar to baseball fans, but there are seven additional (and rarer) ways of getting out in cricket that are too complicated to get into here.

After one bowler has made six consecutive deliveries (called an ‘over’), the ball is given to a different bowler, who bowls an over from the opposite end, while the batsmen stay put during the changeover. Thus the non-striker at the end of one over becomes the striker at the beginning of the next.
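For readers who like to see the bookkeeping spelled out, here is a toy sketch in Python (my own illustration, not anything official; the batsmen’s names and the simplifications, such as ignoring extras and wickets, are mine) of how the strike rotates: an odd number of runs leaves the batsmen at opposite ends, and at the end of each six-ball over the bowling switches ends, so the non-striker of one over faces the first ball of the next.

```python
def simulate_strike(deliveries):
    """Track which of two batsmen is on strike.

    `deliveries` is a list of runs scored off each ball, e.g. [1, 0, 4, ...].
    Returns a list of (ball_number, striker) pairs.
    Toy model only: ignores extras, wickets, and declarations.
    """
    batsmen = ["A", "B"]
    striker = 0  # index into batsmen
    history = []
    for i, runs in enumerate(deliveries):
        history.append((i + 1, batsmen[striker]))
        if runs % 2 == 1:          # odd runs: the batsmen end up at opposite ends
            striker = 1 - striker
        if (i + 1) % 6 == 0:       # end of an over: bowling switches ends,
            striker = 1 - striker  # so the non-striker faces the next ball
    return history

if __name__ == "__main__":
    # One over plus one ball: a single off the first ball, then dot balls.
    for ball, name in simulate_strike([1, 0, 0, 0, 0, 0, 0]):
        print(f"Ball {ball}: {name} on strike")
```

Running it for that little example shows batsman A facing the first ball, B facing balls two through six, and A back on strike for the first ball of the next over, which is exactly the changeover described above.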

(A fairly recent development has been that of one-day games where each team has just one inning that lasts for a maximum of 50 overs, with the team that scores the most runs winning. This format guarantees a result and aggressive batting, and has proven to be very popular with the general public, though cricket purists look down on it. An even shorter version consists of just 20 overs per side or 120 deliveries.)

To play cricket well requires developing a lot of technique (especially for batting) and thus fairly extensive coaching. Simply having raw talent is not sufficient to make it to the top. This is why the villagers in the film Lagaan, having never played the game before, faced such an uphill task in challenging the British team, who presumably had been playing the game since childhood.

I still enjoy watching cricket being played by good teams, although I no longer have the opportunity. There is no question that it is a strange game, and I can understand why newcomers to the game find its appeal highly elusive. It is slow moving and its delights are subtle. It is a game where good technique can give the spectator pleasure, even when displayed by the opponents. A batsman who hits the ball with style and grace, a bowler whose run-up and delivery are fluid and skilful, and a great piece of fielding tend to be appreciated by all spectators, not just those supporting that team.

Cricket is not a game that would have been invented in the modern age. It could only have been conceived in a different, more leisurely era, when people had the time and the money to while away whole days chasing after a ball on a grassy field. The fact that it has survived and even flourished in modern times, with more and more countries taking it seriously, is somewhat amazing.

POST SCRIPT: Class warfare in America

It always amazes me that it is the comedy shows that understand and report on policy best. Catch Stephen Colbert’s look at the minimum wage and class warfare.

Cricket and the politics of class

Whenever I read the novels of (say) Jane Austen or P. G. Wodehouse, which deal with the life of the British upper classes in the nineteenth and early twentieth centuries, one thing that always strikes me is that the characters who inhabit those books never seem to do any work. Beneficiaries of a class-ridden feudal system, they seem to live on inherited income and property that enables them to spend their days not having to worry about actually making a living. There is rarely any description of people’s jobs. Work seems to be something that the lower classes do and is vaguely disreputable. Even in Charles Dickens’ novels, which often dealt with characters who were desperately poor, the happy ending usually took the form of the hero obtaining wealth by means of an inheritance or otherwise, and then promptly stopping work and hanging around at home, even if he was still young.

These rich people seemed to spend all their time visiting each other’s homes for weeks on end, going for walks, riding horses, writing long letters to each other, playing cards and board games, and throwing elaborate parties. In short, these were rich, idle people with plenty of time on their hands.

This kind of life was not entirely unproductive. Some of these people used their time to become amateur scientists, using their freedom from having to earn a living to devote their lives to studying scientific questions, often coming up with major discoveries. Charles Darwin’s voyage on the Beagle was not a job. He was not paid to go to the Galapagos Islands. His was an unpaid expedition, made possible by his lack of financial need. The Nobel-prize-winning physicist Lord Rayleigh was also something of an amateur scientist. Even now the idea of the ‘gentleman scholar’ is quite strong in England, with people developing very detailed expertise in areas of knowledge on their own, purely for the love of it and using their own money.

But many members of the idle rich classes were preoccupied with purely recreational activities and only such a class of people could have enjoyed the game of cricket. After all, who else has the time to play or watch a game that goes on for days on end? International games, called ‘Tests’, last for five consecutive days, six hours a day, with breaks for lunch and afternoon tea. Furthermore, the cricket field requires a large amount of land (an elliptical shape about three to four times the size of a baseball field), with carefully tended turf, and the equipment required to play is expensive, again making it a rich person’s game.

Despite the fact that the economics of the game and the time commitment it required made it hard for working people to play, cricket grew in popularity. Shortened versions of the game that lasted only one day enabled even working people to play it on Sundays, and eventually people even started being paid for playing the game. Such professionals were looked down upon by the amateurs, those who could play without being paid to do so because they were independently wealthy. The latter learned the game at exclusive private schools like Eton and Harrow and then later at prestigious universities like Oxford and Cambridge, and the ranks of the English national team tended to be filled with the products of these elite institutions.

But the class system in England is very strong, and even after professional players became part of cricket teams, some distinctions were maintained. In a manner strikingly similar to Jim Crow laws in the US (although not nearly as severe in intent or implementation), until the mid twentieth century the amateurs (who were called ‘Gentlemen’) and the professionals (who were called ‘Players’) had separate dressing rooms and entered and left the cricket field by different entrances. Teams were usually captained by an amateur, even if the amateur was the worst player in terms of skill, presumably so that an amateur would not have to take orders from a professional. (Unlike in American sports, where a non-playing coach or manager controls the game and tells players what to do, in cricket it is the playing captain who makes all the decisions on the field, and his orders must be obeyed unquestioningly.) Len Hutton was an exception in that he was a professional who captained the England national team in the 1950s.
This Wikipedia entry shows the state of affairs as late as 1950, in a story about Brian Close, who came from a working-class background and in 1966 became the first professional (i.e. Player) to captain England after the amateur/professional distinction was finally and formally abolished in the mid-1960s.

At that time class status was still important: professionals, known as Players, were expected to show deference to the amateurs, who were the Gentlemen. Gentlemen did not share changing rooms with Players, and cricket scorecards would differentiate between the two of them, with the names of Gentlemen being prefixed “Mr”, the names of the professionals being styled by their surnames and then their initials. This was a time when it was considered necessary to announce on the tannoy errors such as “for F.J. Titmus read Titmus, F.J.“.

Close did well for the Players and top-scored with 65. When he reached his fifty, he was congratulated by the Gentlemen’s wicket-keeper, Billy Griffith, and in a conversation that now seems innocuous, Griffith congratulated Close by saying, “Well played, Brian”, with Close replying, “Thank you, Billy”. However, Close had not referred to Griffith as “Mister”, and ten days later was called to see Brian Sellers, a former captain and member of the Yorkshire committee, who reprimanded Close for the effrontery.

In societies that practice domination by one class or ethnicity over another, we often forget the important role that seemingly petty indignities play. In order to achieve complete domination over someone, it is not sufficient to have total legal or even physical control over that person. It is important to also have psychological power, and this is done by destroying their sense of dignity and self-worth. The British imperialists understood this well and never missed an opportunity to rub their ‘superiority’ into their ‘inferiors,’ whether it was the people of their colonies or the working classes at home. People who have little or no dignity or sense of personal self-worth are defeated right from the start and thus easy to control.

This is why developing pride in oneself and building a sense of dignity are usually an important part of getting any group to rise up and organize to improve itself.

The petty practices that arose from such an approach seem bizarre now, and thankfully most of us have not encountered such behavior. But it is sobering to realize that the worst such features were commonplace just fifty years ago and subtle forms still exist.

Next: So what is cricket all about anyway?

Lagaan and the Bollywood film tradition

In watching Lagaan, I was reminded of the increasing interest in the West in Bollywood films. For those not familiar with it, ‘Bollywood’ is a generic term for films produced mostly in the prolific studios of Mumbai (formerly Bombay), an industry that rivals Hollywood in size. But a Bollywood film is not merely defined by where it is produced but also by the nature of its content. (A caveat: I have never been a fan of Bollywood films and my following comments should not be taken too seriously, because I have not seen many such films, and the few I did see were many, many years ago when I was an undergraduate in Sri Lanka. It is quite possible that my perceptions are out of date and that these films have changed and improved considerably over time.)

Bollywood films were immensely popular in Sri Lanka despite being in Hindi (a language not spoken there) and with no subtitles. The lack of understanding of the dialogue did not seem to pose a problem for audiences because the strict formula and conventions of these films made the general features of plot transparent and the details immaterial. The formula required many things. The films had to be long, at least three hours. Cinema was the chief form of popular entertainment and poor people wanted their money’s worth. The films were also outrageously escapist. The male and female leads were invariably young and good looking and middle or upper class, with lifestyles beyond the reach of most of their audiences. The plot was always boy-meets-girl, boy-and-girl-lose-each-other, boy-and-girl-overcome-problems-and-get-married.

The plot usually involved some misunderstanding that could have been easily resolved if someone had simply spoken up at the right time, but inexplicably did not. A daft woman was often the culprit. Providing light relief and humor is a comic sub-plot, usually involving servants or working class or stupid people, that runs in parallel with the main story line. The villain in the film usually has no redeeming qualities. In fact, the main romantic leads and the villain lack complexity and depth of character, being just types. This makes it easy to root for the heroes and hiss the villain. It is usually the supporting characters who are allowed to display some complexity and development. And a Bollywood film must end happily, with the villain getting his (it is usually a man) just deserts.

And of course, there have to be songs. Lots of songs. Combined with dancing. Lots of dancing. These are combined into big production numbers that break out often for no discernible reason and seem to go on and on and serve no purpose in the story other than to jazz up the proceedings. The song-and-dance scenes usually involve rapid changes of clothes and location. Just within one song, the couple might be singing wearing one outfit in their local town, then the location will shift to London in another outfit, then to the Alps, then Tokyo, and so on. Why? No dramatic reason. Just to give the audience the sheer escapist pleasure of seeing the world’s tourist spots. The romantic leads sing and dance in parks and play peek-a-boo behind the trees.

The songs, the songwriters, and the singers of the songs (called “playback singers”) are the actual stars of a Bollywood film. The singers are not seen and the actors lip-synch to them, transparently so. Little effort is made to match the actor’s own voice with that of the playback singer. It is not unusual in a big ensemble song-and-dance scene for several characters who have vastly different speaking voices to ‘sing’ different lines while the same playback singer is used for all of them. Verisimilitude is not a high priority for Bollywood film audiences, who seem to subscribe to Duke Ellington’s dictum: “If it sounds good, it is good.”

Lagaan sticks to the Bollywood formula in many areas but deviates from it in significant ways. It is very long, but it is a tribute to the screenwriters and director that I did not feel it dragging at all. There is no comic sub-plot. The song-and-dance numbers are still there but are thankfully far fewer (I think there were only six), and they were integrated into the story and advanced the plot. In fact, the last song, a devotional one sung by the villagers during the night before the third and final day of the match, when they had their backs to the wall and were asking god to help them, was extraordinarily beautiful and very moving.

One Bollywood tradition that was retained in Lagaan was that the male and female leads must always be good looking, well-groomed, and very buff, whatever the circumstances. Here they play two young people in an impoverished village that is baking in the heat, suffering from drought, its people close to starving. You would expect such people to look somewhat emaciated and haggard, and yet the two leads always look like they have just come from a spa, with hair in place, clean-shaven, clean clothes, and makeup done just so. Only the supporting characters sweat and wear torn and shabby clothes.

Another tradition that was retained was that the villain had no redeeming qualities. Here the villain is the British Captain Russell, who offered the wager that could not be refused. He always has a sneer on his face and never seems to miss an opportunity to be nasty. In order to do a trivial favor for the raja (prince), he insists that the raja (who is a vegetarian) must eat meat. He kicks, and beats with a whip, a villager who accidentally hurts his horse while shoeing it. He yells at a subordinate because he did not see him salute. And he kills a deer and a rabbit for fun. You can be sure that the director chose those particular animals for their cuteness appeal and to increase the repulsion of the audience. The close-ups of those two animals just prior to their deaths show them looking like Bambi and Thumper. I am surprised that Russell was not shown kicking a puppy.

But all that pales before the unmistakable sign of Russell’s bad character, which is that he indulges in unsportsmanlike behavior at cricket! In British tradition, cricket is the ultimate venue for fair play and anyone who does not play by the spirit of the rules, let alone the letter, is undoubtedly a bad person. George Orwell in his essay Raffles and Miss Blandish highlights this peculiarly British belief that someone who is good at cricket and upholds its spirit of sportsmanship is automatically assumed to be a good person, whereas someone who acts unsportingly, let alone (gasp!) cheats at the game, is considered a bounder, a cad, a scoundrel, a blackguard, completely beyond the pale. (Raffles is a fictional character in British literature, a thief who uses his acceptance in high society and invitations to their parties to steal people’s valuable possessions from their homes. No one suspects him because he played for the English national cricket team so how could he possibly be a thief?) To do something, anything, that is branded as ‘not cricket’ is to be accused of violating the spirit of fair play.

Although Lagaan retains some of the Bollywood and cricket clichés, it is a tribute to the film that it is also able to rise above them and tell a good story well.

Next: Cricket and the class system.

POST SCRIPT: So that explains it

New Scientist magazine reports on the results of a new study that finds that “Overconfident people are more likely to wage war but fare worse in the ensuing battles”. It also finds that “Those who launched unprovoked attacks also exhibited more narcissism.”

The study, done by Dominic Johnson of Princeton University involving 200 volunteers playing war games, was published in the Proceedings of the Royal Society B.

Bertram Malle of the University of Oregon says that the study raises worrying questions about real-world political leaders. “Perhaps most disconcerting is that today’s leaders are above-average in narcissism,” he notes, referring to an analysis of 377 leaders published in King of the Mountain: The nature of political leadership by Arnold Ludwig.

Peter Turchin of the University of Connecticut comments, “One wishes that members of the Bush administration had known about this research before they initiated invasion of Iraq three years ago.” He adds, “I think it would be fair to say that the general opinion of political scientists is that the Bush administration was overconfident of victory, and that the Iraq war is a debacle.”

I think it is naïve to think that things might have been different if the Bush administration had known of this study. I can’t recall the source now but there was an earlier study that found that the prime reason that some people are so incompetent is that they are unaware that they are incompetent! They do not think that negative indicators apply to them and thus do not seek to improve themselves. Such a lack of realistic self-awareness seems to be a hallmark of the current leadership.

Lagaan

I recently watched the film Lagaan (2001) (Hindi and English with English subtitles) on DVD and was very impressed. Although the film is very long (3 hours, 45 minutes!) it did not drag at all which, for me, puts its director (Ashutosh Gowarikar) in the same class as David Lean (Lawrence of Arabia, Bridge on the River Kwai) as one of those rare filmmakers who can make me overcome my feeling that films should not exceed two hours, and preferably should be 90 minutes.

Lagaan takes place in a remote village region in India in 1893 during British colonial rule. The area has been hit by a drought for several years and the impoverished villagers are unable to pay the tax (‘lagaan’) to their British military rulers.

In seeking relief from the tax, some of the villagers try to ask for a temporary amnesty, but they run afoul of the local British military head, Captain Russell, who, in a fit of pique because of a prior run-in with one of the villagers (the hero of the film), actually doubles the tax instead. When the appalled villagers protest, Russell raises the stakes even more. He says that he will now triple the tax but offers them the following wager: he will exempt the village from any tax at all for three years if the villagers can field a cricket team that beats a team made up of the British military officers. Since the British officers grew up playing the game and play cricket all the time even in India, while the villagers have never even seen the game, the villagers seem doomed. But having no option but to agree to this unbalanced wager, the villagers set about trying to learn to play cricket within the three months allocated to them, and this preparation and the actual climactic game form the main storyline of the film.

The villagers who form the cricket team include Hindus, Muslims, Sikhs, a handicapped villager, and members of the so-called ‘untouchable’ caste, and they have to learn to overcome their traditional animosities for the sake of the village. This rag-tag group, lacking proper equipment or coaching (except for some guidance from the sympathetic sister of Captain Russell, who is appalled by her brother’s cruelty), has to resort to unorthodox training methods, such as catching chickens to improve their reflexes and fielding technique.

Clearly the cricket match is a metaphor for the independence struggle waged by India against the British, which resulted in the British being forced to leave in 1947. That struggle was a landmark in national liberation struggles, with people like Jawaharlal Nehru and Mahatma Gandhi successfully managing to forge a highly diverse and large population, riddled with religious, ethnic, language, caste, and class differences, into a cohesive force against a common enemy. Unfortunately, that unity was short-lived, with ongoing Hindu-Muslim clashes, the partitioning into India and Pakistan, the Kashmir area still under dispute, Sikh dissatisfaction, the isolation of the so-called ‘untouchable’ caste, and so on. But they managed to work together enough during crucial periods to make continuing British control impossible. Like the village cricket team being forced to learn how to play the game of their oppressor, the Indian independence leaders had to learn the ‘game’ of British politics and public opinion in order to advance their goals.

Cricket as a metaphor for the anti-imperialist and anti-colonial struggle against the British is extended when we realize that the demise of the British Empire after World War II correlated with the decline in the dominance of their cricket. Now India and Pakistan are dominant cricket nations, regularly beating England in international contests (called ‘Test’ matches), and two current players who are easily among the best batsmen of all time (Sachin Tendulkar of India and Brian Lara of West Indies) come from former British colonies. Sri Lanka also fields competitive international teams. While in Lagaan the villagers were totally ignorant of the game and amused by the Englishmen’s passion for what they considered a childish pastime, nowadays the Indian subcontinent has arguably the most enthusiastic cricket fans in the world and there is probably no corner, however remote, where children are not enthusiastically playing it.

You don’t really need to understand cricket in order to appreciate this fine film, but in a later posting, I will provide a Cliffs notes version for those who are bewildered by the appeal of this very strange game.

(Note: If you are a member of the Case community, you can borrow the Lagaan DVD from the Kelvin Smith Library.)

Next: Lagaan and the Bollywood film tradition.

POST SCRIPT: The real reason why the attack on Iraq was wrong

Periodically, some defender of the invasion of Iraq will resurrect the idea that Iraq did possess so-called weapons of mass destruction. The latest people to do this are Senator Rick Santorum and Congressman Peter Hoekstra, whose claims have been disavowed even by Defense Department officials and Fox News.

Although their claims have been discredited, it is easy for such discussions to obscure an important and fundamental fact. The immorality and illegality of the invasion of Iraq has nothing to do with whether such weapons existed or not, so whether they are found or not is not central to the issue of whether the attack was justified. The war was wrong because Iraq had neither attacked nor even threatened to attack the US. What the US engaged in was an unjustified war of aggression.

What the lack of discovery of weapons shows is that the Bush administration lied even about their unjustified rationale for the attack.

Free will

Belief in a god rests on a foundation that requires one to postulate the existence of a mind/soul that can exist independently of the body (after all, the soul is assumed to live on after the physical death of the body) and freely make decisions. The idea that the brain is all there is, that it creates our consciousness and that the mind/soul are auxiliary products of that overall consciousness, strikes at the very root of belief in god.

So what about the role of free will? Where does that fit in with this? If the mind is an entity that exists independently of the brain and which can influence the brain, then one can think of free will as a product of the mind. But is free will compatible with the idea that the brain is all there is?

The idea that we have free will came under attack with the development of materialistic models of the universe. With the success of Newtonian physics in explaining and predicting the motion of celestial and terrestrial objects, and with the rise of a materialistic philosophy of nature (that everything consists of matter in motion under the influence of natural laws), it became inevitable for people to suppose that the mechanical universe was all there is.

According to the Newtonian model, all you needed to be able to predict the future state of an object was (1) exact knowledge of the current state of the object (known as the initial conditions), and (2) the forces of interaction between the object and its environment, because it is these forces, and only these forces, that influence its subsequent behavior. Since there was no reason to think that these two types of information were unknowable in principle, that implied that the future of that object was predetermined. If everything that existed in the universe (including the brain and mind) had this same material basis and consisted of objects in motion, then the logical implication is that the future is predetermined.
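To make this concrete, here is a minimal sketch in Python (my own toy illustration, with made-up numbers, for the simplest possible case of an object falling under gravity) of what such a Newtonian prediction looks like: given the initial conditions and the force, stepping the equations of motion forward fixes the object’s entire future.

```python
def predict_fall(position, velocity, seconds, dt=0.01):
    """Step a falling object forward in time using Newton's second law.

    Knowing the initial conditions (position, velocity) and the only force
    acting (gravity) determines the state at every later moment.
    """
    g = -9.8                      # acceleration due to gravity, m/s^2
    steps = round(seconds / dt)   # number of small time steps to take
    for _ in range(steps):
        velocity += g * dt        # the force changes the velocity...
        position += velocity * dt # ...and the velocity changes the position
    return position, velocity

# The same initial conditions always give the same answer: nothing is left open.
print(predict_fall(position=100.0, velocity=0.0, seconds=2.0))
```

Run it as many times as you like and the trajectory never changes, which is the whole point: in the Newtonian picture the future is already contained in the present.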

Of course, the mere fact of predetermination did not imply that the future was predictable in practice. Since any object other than a few elementary particles is composed of a vast number of constituent elements such as atoms, no program of prediction can be actually carried out, simply because of the enormous complexity of the calculations involved. Since we are not able to predict the future with 100% accuracy in the absence of perfect information, the belief in an undetermined future for anything but elementary particles can be preserved from actual experimental contradiction.

But at a philosophical level, the fact that predetermination existed in the deterministic Newtonian world pretty much killed the idea of free will and the existence of an independent mind, and hence god. So in order to preserve those concepts, one has to find flaws in either or both of the two underpinnings of the Newtonian system given above.

One approach is to argue that we can never know all the forces acting on an object. This is essentially the idea behind the concept of god (or an intelligent designer, which is the same thing), whose actions do not conform to any natural laws and who can hence intervene in any system in unpredictable ways. There has been no real evidence that such an unknown and unpredictable force exists.

The other approach is to argue that we cannot know, even in principle, what the initial conditions are. This latter view actually has experimental support (at least in some situations) in quantum mechanics and the Heisenberg uncertainty principle, which says that there is an underlying limit, inherent in nature, on the precision with which we can know the initial state of a system. The quantum world is not totally unpredictable, of course. In fact, there exists a very high degree of predictability, but it is a statistical predictability: we can state with some certainty what will happen on average, but each individual event is unpredictable. A classical analog is the case of tossing a coin. If I toss a coin a million times, I can predict with a very high degree of confidence that the number of heads will be very close to 50%, but I have only a 50-50 chance of guessing the result of any individual toss. And as I said before, almost everything in nature is made up of a vast number of constituent elements, so it is the average motions of all these things that actually matter. This is why the predictions of science tend to be so accurate.
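The coin-toss analogy is easy to check with a few lines of Python (a toy simulation of my own; the numbers are purely illustrative):

```python
import random

def toss_experiment(n_tosses=1_000_000, seed=None):
    """Simulate fair coin tosses and report the fraction of heads.

    Each individual toss is unpredictable (50-50), but the aggregate
    fraction of heads over many tosses is predictable with high confidence.
    """
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

print(toss_experiment())   # a million tosses: always very close to 0.5
print(toss_experiment(1))  # a single toss: 0.0 or 1.0, with no way to know which
```

The aggregate comes out within a fraction of a percent of one half every time, while the single toss remains a pure guess, which is exactly the kind of statistical predictability the quantum world offers.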

But the fact that there is even this small inherent uncertainty in nature has led some religious scientists to argue that quantum mechanics provides a non-deterministic niche that allows god to act and they have seized on it. For example, Brown University biology professor Ken Miller is a devout Catholic who has been a very strong opponent of the intelligent design movement. In his book Finding Darwin’s God he reconciles his belief in god with his belief in the sufficiency of natural selection by invoking the uncertainty principle as the means by which god can act in the world and yet remain undetectable. He doesn’t actually suggest a mechanism, he just asserts that quantum mechanics allows a window through which god can act.

So in some sense, the uncertainty principle is playing the role that the pineal gland played for Descartes, providing a point of intersection between the nonmaterial world and the material world.

Those, like Jeffrey Schwartz and William Dembski, who are looking for new ways to preserve their intelligent design idea, have also tried to use the uncertainty principle to create room for it.

Frankly, this is not convincing. Although the uncertainty principle does assert an inherent limit, set by nature, on some kinds of knowledge, the limitation is highly restricted in its operation, significant only for very small objects at very low temperatures, and does not allow for the wide latitude required to believe in the kind of arbitrary intervention of god in the physical world that is favored by religious people. As the article Religion on the Brain (subscription required) in the May 26, 2006 issue of the Chronicle of Higher Education (Volume 52, Issue 38, Page A14) says:

Last year Dr. Schwartz and two colleagues published a paper on their quantum theory in the Philosophical Transactions of the Royal Society B. They are not the first to try linking quantum mechanics to concepts of consciousness, but such efforts have failed to win over either physicists or neuroscientists, who discount the role that quantum effects would play at the size and temperature of the human brain. In discussions of consciousness, “the only reason people involve quantum mechanics is because of pure mysticism,” says Christof Koch, a professor of cognitive and behavioral biology at the California Institute of Technology.

Using the quantum mechanical uncertainty principle to sneak in god into the world is not tenable. Those who know anything about quantum mechanics, even those sympathetic to religion, see this as a futile maneuver, serving only to awe those who are intimidated by quantum mechanics.

Many other scientists have been highly critical of Dr. Schwartz; even some researchers interested in exploring spirituality discount his theory. The Templeton Foundation, a philanthropy devoted to forging links between science and religion, rejected a grant proposal by Dr. Schwartz, says Charles L. Harper Jr., senior vice president of the foundation. A cosmologist by training, Mr. Harper says the proposal was turned down because “it had to do with a lot of hocus-pocus on quantum mechanics.”

So that is where things stand. To retain a belief in god and free will and soul requires one to postulate not just one non-material entity (god) interacting with the material world, but to suggest that each one of us also possesses a non-material entity (the soul/mind) that exists independently of us and interacts only with our own material brain (and with no one else’s brain) in some unspecified way. The mind-body interaction must have a blocking mechanism that prevents such cross-over since, if one person’s mind/soul can interact with another person’s brain, that can cause all kinds of problems.

Is this a plausible picture? Again, plausibility is something that each person must judge. For me personally, it just seems far too complicated, whereas assuming that the brain is all there is makes things simple.

In my own case, I had already begun to seriously doubt the existence of god before I even thought about the brain/mind relationship. When I started looking closely at how the brain works, I became convinced that the idea of a mind that has an existence independent of the brain was highly implausible. The dawning realization that the brain is all there is sealed the conviction that there is no god.

POST SCRIPT: Running on empty

Money was hard to borrow in Sri Lanka when I was growing up. So we got used to the idea that we had to live within our means or have to (embarrassingly) borrow from friends and relatives. One of the things that took me a long time to get used to in the US was the ease of credit and that people would go so willingly and easily into debt, even for things like unnecessary luxury goods or taking vacations. I am still not used to that actually, even after all these years here. I cannot imagine borrowing money except for absolute necessities.

As we all know, the saving rate in America is non-existent and even (by some reports) negative, which means that as a whole, the people in the nation are spending more than they earn. We also know that the government is racking up huge budget deficits, and record-breaking debt.

Why is this happening? How long can it continue? Why is everyone seemingly oblivious to this?

Danny Schecter has created a new documentary, In Debt We Trust: America before the bubble bursts (coming out in June 2006), in which he talks about how the rise in debt is being deliberately driven by people who make money off increasing indebtedness.

You can read about it and see the trailer here.

What the neuroscience community thinks about the mind/brain relationship

The idea that the mind is purely a product of the material in the brain has profound consequences for religious beliefs, which depend on the idea of the mind as an independent controlling force. The very concept of ‘faith’ implies an act of free will. So the person who believes in a god is pretty much forced to reject the idea that the mind is purely a creation of the brain. As the article Religion on the Brain (subscription required) in the May 26, 2006 issue of the Chronicle of Higher Education (Volume 52, Issue 38, Page A14) says:

Pope John Paul II struck a similar theme in a 1996 address focusing on science, in which he said theories of evolution that “consider the mind as emerging from the forces of living matter, or as a mere epiphenomenon of this matter, are incompatible with the truth about man. Nor are they able to ground the dignity of the person.”

As I wrote yesterday, the flagging intelligent design creationism (IDC) movement seems to be hoping for some fresh energy to emerge from the work of psychiatric researcher Dr. Schwartz. Or at the very least they may be hoping that they can persuade the public that the mind does exist independently of the brain. But they are going to have a hard time getting traction for this idea within the neurobiology community. There seems to be a greater degree of unanimity among them about the material basis of the mind than there is among biologists about the sufficiency of natural selection.

Stephen F. Heinemann, president of the Society for Neuroscience and a professor in the molecular-neurobiology lab at the Salk Institute for Biological Studies, in La Jolla, Calif., echoed many scientists’ reactions when he said in an e-mail message, “I think the concept of the mind outside the brain is absurd.”

But the ability of the neurobiology community to do their work unfettered by religious scrutiny may be coming to an end as increasing numbers of people become aware of the consequences of accepting the idea that the mind is purely a product of the brain. People might reject this idea (and be attracted to the work of Dr. Schwartz), not because they have examined and rejected the scientific evidence in support of it, but because it threatens their religious views. As I discussed in an earlier posting, people who want to preserve a belief system will accept almost any evidence, however slender or dubious, if it seems to provide them with an option of retaining it. As the article says:

Though Dr. Schwartz’s theory has not won over many scientists, some neurobiologists worry that this kind of argument might resonate with the general public, for whom the concept of a soul, free will, and God seems to require something beyond the physical brain. “The truly radical and still maturing view in the neuroscience community that the mind is entirely the product of the brain presents the ultimate challenge to nearly all religions,” wrote Kenneth S. Kosik, a professor of neuroscience research at the University of California at Santa Barbara, in a letter to the journal Nature in January.
. . .
Dr. Kosik argues that the topic of the mind has the potential to cause much more conflict between scientists and the general public than does the issue of evolution. Many people of faith can easily accept the tenets of Darwinian evolution, but it is much harder for them to swallow the assumption of a mind that arises solely from the brain, he says. That issue he calls a “potential eruption.”

When researchers study the nature of consciousness, they find nothing that persuades them that the mind is anything but a product of the brain.

The reigning paradigm among researchers reduces every mental experience to the level of cross talk between neurons in our brains. From the perspective of mainstream science, the electrical and chemical communication among nerve cells gives rise to every thought, whether we are savoring a cup of coffee or contemplating the ineffable.
. . .
Mr. [Christof] Koch [a professor of cognitive and behavioral biology at the California Institute of Technology] collaborated for nearly two decades with the late Francis Crick, the co-discoverer of DNA’s structure, to produce a framework for understanding consciousness. The key, he says, is to look for the neural correlates of consciousness – the specific patterns of brain activity that correspond to particular conscious perceptions. Like Crick, Mr. Koch follows a strictly materialist paradigm that nerve interactions are responsible for mental states. In other words, he says, “no matter, never mind.”

Crick summed up the materialist theory in The Astonishing Hypothesis: The Scientific Search for the Soul (Scribner, 1994). He described that hypothesis as the idea that “your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules.”

What many people may find ‘astonishing’ about Crick’s hypothesis is that among neurobiologists it is anything but astonishing. It is simply taken for granted as the way things are. Is it surprising that religious believers find such a conclusion unsettling?

Next: What does “free will” mean at a microscopic level?

POST SCRIPT: Why invading Iraq was morally and legally wrong

Jacob G. Hornberger, founder and president of The Future of Freedom Foundation, has written a powerful essay that lays out very clearly the case for why the US invasion and occupation of Iraq is morally and legally indefensible, and why it has inevitably led to the atrocities that we are seeing there now, where reports are increasingly emerging of civilians being killed by US forces. Hornberger writes, “I do know one thing: killing Iraqi children and other such ‘collateral damage’ has long been acceptable and even ‘worth it’ to U.S. officials as part of their long-time foreign policy toward Iraq.”

The article is well worth reading.

IDC gets on board the brain train

An article titled Religion on the Brain (subscription required) in the May 26, 2006 issue of the Chronicle of Higher Education (Volume 52, Issue 38, Page A14) examined what neuroscientists are discovering about religion and the brain. It is a curious article. The author (Richard Monastersky) seems to be trying very hard to find evidence in support of the idea that brain research is pointing to the independent existence of a soul/mind, but it is clear on reading it that he comes up short and that there is no such evidence, only the hopes of a very small minority of scientists.

He reports that what neuroscientists have been doing is studying what happens in the brain when religious people pray or meditate or think about god or have other similar experiences.

At the University of Pennsylvania, Andrew B. Newberg is trying to get at the heart – and mind – of spiritual experiences. Dr. Newberg, an assistant professor of radiology, has been putting nuns and Buddhist meditators into a scanning machine to measure how their brains function during spiritual experiences.

Many traditional forms of brain imaging require a subject to lay down in a claustrophobia-inducing tube inside an extremely loud scanner, a situation not conducive to meditation or prayer, says Dr. Newberg. So he used a method called single-photon-emission computed tomography, or Spect, which can measure how a brain acted prior to the scanning procedure. A radioactive tracer is injected into the subjects while they are meditating or praying, and the active regions of the brain absorb that tracer. Then the subjects enter the scanner, which detects where the tracer has settled.

His studies, although preliminary, suggest that separate areas of the brain became engaged during different forms of religious experience. But both the nuns and the meditators showed heightened activity in their frontal lobes, which are associated in other studies with focused attention.

The experiments cannot determine whether the subjects were actually in the presence of God, says Dr. Newberg. But they do reveal that religious experiences have a reality to the subjects. “There is a biological correlate to them, so there is something that is physiologically happening” in the brain, he says.

The finding that certain parts of the brain get activated during ‘spiritual experiences’ is not surprising. Neither is the fact that those experiences have a ‘reality to the subjects.’ All acts of consciousness, even total hallucinations, are believed to originate in the brain and leave a corresponding presence there, and why the researcher ever expected this to demonstrate evidence for god is not made clear in the article.

It is clear that intelligent design creationism (IDC) advocates are concerned about the implications of brain studies for religious beliefs. It seems plausible that as we learn more and more about how the brain works and about consciousness in general, the idea of a mind independent of the brain becomes harder to sustain. Hence IDC advocates are promoting meetings that highlight the work of those few researchers who think they see a role for god within the brain. But these meetings are being held in secret.

Organizers of the conference, called “Research and Progress on Intelligent Design,” had hoped to keep its existence out of public view. The university held a well-advertised public debate about ID that same week, but Michael N. Keas, a professor of history and the philosophy of science at Biola who coordinated the private meeting, would not confirm that it was happening when contacted by a reporter, nor would he discuss who was attending.

But one of the people doing this work is not shy about talking about his research.

When the leaders of the intelligent-design movement gathered for a secret conference this month in California, most of the talks focused on their standard concerns: biochemistry, evolution, and the origin of the universe. But they also heard from an ally in the neurosciences, who sees his own field as fertile ground for the future of ID.

Jeffrey M. Schwartz, a research professor of psychiatry at the University of California at Los Angeles, presented a paper titled “Intelligence Is an Irreducible Aspect of Nature” at the conference, held at Biola University, which describes itself as “a global center for Christian thought.” Dr. Schwartz argued that his studies of the mind provide support for the idea that consciousness exists in nature, separate from human brains.

Michael Behe, the author of Darwin’s Black Box, which suggested five ‘irreducibly complex’ systems on which the IDC people have long hung their hopes for evidence of god, may be losing his status as the IDC movement’s scientific standard-bearer. His book came out in 1996 and nothing new has been produced since then. It is clear that you cannot dine forever on that meager fare, especially since evolutionary biologists keep churning out new results all the time. The need for a new poster child is evident, and it seems as if the IDC movement has found one in psychiatrist Schwartz.

Leaders of the intelligent-design movement, though, see clear potential for Dr. Schwartz’s message to resonate with the public.

“When I read Jeff’s work, I got in touch with him and encouraged him to become part of this ID community,” says William A. Dembski, who next month will become a research professor in philosophy at the Southwestern Baptist Theological Seminary, in Texas. “I regard him as a soul mate,” says Mr. Dembski.

This may be a sign that the real science-religion battle is shifting away from biological evolution to brain research. This new battle will not be as high-profile as the evolution one simply because brain studies are not part of the school curriculum and thus not subject to the policies of local school boards. So the evolution battle will likely continue to dominate the news headlines for some time.

Tomorrow we will see what neurobiologists think of this attempt to find god in their area of study. If the IDC advocates thought that the biologists were a tough foe to convince, they are going to find that the brain research community is even more resistant to their overtures.

POST SCRIPT: War profiteers

One of the underreported stories of the Iraq invasion is the enormous amount of money that is being made by some people because of it. Coming in fall 2006 is a new documentary by Robert Greenwald titled Iraq for Sale: The War Profiteers.

Greenwald’s marketing strategy for his documentaries has been to bypass the main distribution networks and put his documentaries out straight to video for a low price. He did this with his earlier productions Outfoxed: Rupert Murdoch’s war on journalism (a look at the bias of Fox News), Uncovered: The war on Iraq (which exposed the fraudulent case made for the Iraq invasion), and Walmart: The high cost of low prices.

Look out for the release of Iraq for Sale. You can see the preview here.

Religion’s last stand-2: The role of Descartes

In the previous posting, I discussed two competing models of the mind/brain relationship.

It seems to me that the first model, where the physical brain is all there is and the mind is simply the creation of the brain, is the most persuasive one since it is the simplest and accepting it involves no further complications. In this model, our bodies are purely material things, with the brain’s workings enabling us to think, speak, reason, act, and so forth. The idea of ‘free will’ is an illusion due to the brain being an enormously complicated system whose processes and end results cannot be predicted. (A good analogy would be classically chaotic systems like the weather. Because of the non-linearity of the equations governing weather, we cannot predict long-term weather even though the system is deterministic and materialistic.)
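To make that parenthetical analogy concrete, here is a minimal sketch, using the logistic map as a stand-in toy example of a chaotic system (my own illustration, not anything from the post): the update rule is completely deterministic, yet two starting values that differ only in the sixth decimal place soon produce entirely different trajectories.

```python
# A toy illustration of deterministic chaos: the logistic map.
# The choice of this map and the starting values are my own example,
# offered only to illustrate the analogy in the paragraph above.

def logistic(x, r=4.0):
    """One step of the deterministic rule x -> r * x * (1 - x)."""
    return r * x * (1 - x)

x_a, x_b = 0.200000, 0.200001  # two almost identical starting points

for step in range(1, 31):
    x_a, x_b = logistic(x_a), logistic(x_b)
    if step % 10 == 0:
        # Same rule, no randomness anywhere, yet the trajectories diverge.
        print(f"step {step:2d}: {x_a:.6f} vs {x_b:.6f} (difference {abs(x_a - x_b):.6f})")
```

After a few dozen iterations the two runs bear no resemblance to each other, even though nothing random is involved, which is the sense in which a system like the weather can be fully deterministic and still unpredictable in practice.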

The second model, that of an independently existing non-material mind/soul, separate from the brain and directing the brain, immediately raises all kinds of problems, which have long been recognized. The scientist-philosopher Rene Descartes (1596-1650) of “I think, therefore I am” fame was perhaps the first person to formulate this mind-body dualism (or at least he is the person most closely associated with the idea) and it is clear that he felt that it was necessary to adopt this second model if one was to retain a belief in god.

But he realized immediately that it raises the problem of how the non-material mind/soul can interact with the material brain/body to get it to do things. Princess Elizabeth of Bohemia, with whom Descartes had an extended correspondence, was unable to understand Descartes’ explanation of this interaction and kept prodding him on this very question. Descartes had no adequate answer for her, even though both clearly wanted to believe in the existence of god and the soul. In the introduction to his translation of Descartes’ Meditations and other Metaphysical Writings (which contains extended segments of the Elizabeth-Descartes correspondence), Desmond Clarke writes:

After repeated attempts to answer the question, how is it possible for something which is not physical to interact with something else which, by definition, is physical?, Descartes concedes that he cannot explain how it is possible.

But he tried, using the best scientific knowledge available to him at that time. He argued that the soul’s interaction with the body took place in the pineal gland.

As is well known, Descartes chose the pineal gland because it appeared to him to be the only organ in the brain that was not bilaterally duplicated and because he believed, erroneously, that it was uniquely human. . . By localizing the soul’s contact with body in the pineal gland, Descartes had raised the question of the relationship of mind to the brain and nervous system. Yet at the same time, by drawing a radical ontological distinction between body as extended and mind as pure thought, Descartes, in search of certitude, had paradoxically created intellectual chaos.

Although Descartes failed in his efforts to convincingly demonstrate the independent existence of the soul, research into the relationship of religious beliefs to the central nervous system of the brain has continued.

Descartes is an interesting character. Much of his scientific work, and even his temperament, seem to indicate a materialistic outlook. But at the same time, he took great pains to try and find proofs of god’s existence. One gets the sense that he was a person trying to convince himself of something he did not quite believe in, and that had he lived in a different time he might have rejected god with some relief. The article on Descartes in Encyclopaedia Britannica Online (13 June 2006) says:

Even during Descartes’s lifetime there were questions about whether he was a Catholic apologist, primarily concerned with supporting Christian doctrine, or an atheist, concerned only with protecting himself with pious sentiments while establishing a deterministic, mechanistic, and materialistic physics.

The article points to a likely reason for the ambiguity of his views: there was, at that time, considerable fear of the power of the Catholic Church, and this may have guided the way he presented his work.

In 1633, just as he was about to publish The World (1664), Descartes learned that the Italian astronomer Galileo Galilei (1564–1642) had been condemned in Rome for publishing the view that the Earth revolves around the Sun. Because this Copernican position is central to his cosmology and physics, Descartes suppressed The World, hoping that eventually the church would retract its condemnation. Although Descartes feared the church, he also hoped that his physics would one day replace that of Aristotle in church doctrine and be taught in Catholic schools.

Descartes definitely comes across as somewhat less than pious, and non-traditional in his religious beliefs.

Descartes himself said that good sense is destroyed when one thinks too much of God. He once told a German protégée, Anna Maria van Schurman (1607–78), who was known as a painter and a poet, that she was wasting her intellect studying Hebrew and theology. He also was perfectly aware of – though he tried to conceal – the atheistic potential of his materialist physics and physiology. Descartes seemed indifferent to the emotional depths of religion. Whereas Pascal trembled when he looked into the infinite universe and perceived the puniness and misery of man, Descartes exulted in the power of human reason to understand the cosmos and to promote happiness, and he rejected the view that human beings are essentially miserable and sinful. He held that it is impertinent to pray to God to change things. Instead, when we cannot change the world, we must change ourselves.

Clearly he was not orthodox in his thinking. Although he tried to believe in god, it was his insistence on applying the same materialistic principles he used in his scientific work to the question of how the mind interacts with the brain that has the potential to create the big problem for religion.

To sum up Descartes’ argument: following sound scientific (methodologically naturalistic) principles, he felt that if the mind interacted with the brain, then there had to be (1) some mechanism by which the non-material mind could influence the material brain, and (2) some place where this interaction took place. Although he could not satisfactorily answer the first question, he at least postulated a location for the interaction, the pineal gland. We now know that that is wrong, but the questions he raised are still valid and interesting ones that go to the heart of religion.

Next: What current researchers are finding about the brain and religion.

POST SCRIPT: Documentary on Rajini Rajasingham-Thiranagama

I have written before about the murder of my friend Rajini Rajasingham-Thiranagama, who had been an active and outspoken campaigner for human rights in Sri Lanka. I have learned that a documentary about her life called No More Tears Sister is the opening program in the 2006 PBS series P.O.V.

In the Cleveland area, the program is being shown on Friday, June 30, 2006 at 10:00pm on WVIZ 25. Airing dates vary by location, with some PBS stations showing it as early as June 27. The link above gives program listings for other cities. The synopsis on the website says:

If love is the first inspiration of a social revolutionary, as has sometimes been said, no one better exemplified that idea than Dr. Rajani Thiranagama. Love for her people and her newly independent nation, and empathy for the oppressed of Sri Lanka – including women and the poor – led her to risk her middle-class life to join the struggle for equality and justice for all. Love led her to marry across ethnic and class lines. In the face of a brutal government crackdown on her Tamil people, love led her to help the guerrilla Tamil Tigers, the only force seemingly able to defend the people. When she realized the Tigers were more a murderous gang than a revolutionary force, love led her to break with them, publicly and dangerously. Love then led her from a fulfilling professional life in exile back to her hometown of Jaffna and to civil war, during which her human-rights advocacy made her a target for everyone with a gun. She was killed on September 21, 1989 at the age of 35.

As beautifully portrayed in Canadian filmmaker Helene Klodawsky’s “No More Tears Sister,” kicking off the 19th season of public television’s P.O.V. series, Rajani Thiranagama’s life is emblematic of generations of postcolonial leftist revolutionaries whose hopes for a future that combined national sovereignty with progressive ideas of equality and justice have been dashed by civil war – often between religious and ethnic groups, and often between repressive governments and criminal rebel gangs. Speaking out for the first time in the 15 years since Rajani Thiranagama’s assassination, those who knew her best talk about the person she was and the sequence of events that led to her murder. Especially moving are the memories of Rajani’s older sister, Nirmala Rajasingam, with whom she shared a happy childhood, a political awakening and a lifelong dedication to fighting injustice; and her husband, Dayapala Thiranagama, who was everything a middle-class Tamil family might reject – a Sinhalese radical student from an impoverished rural background. Also included are the recollections of Rajani’s younger sisters, Vasuki and Sumathy; her parents; her daughters, Narmada and Sharika; and fellow human-rights activists who came out of hiding to tell her story. The film rounds out its portrayal with rare archival footage, personal photographs and re-enactments in which Rajani is portrayed by daughter Sharika Thiranagama. The film is narrated by Michael Ondaatje, esteemed author of The English Patient and Anil’s Ghost.

I knew Rajini well. We were active members of the Student Christian Movement in Sri Lanka when we were both undergraduates at the University of Colombo. It does not surprise me in the least that she threw herself with passion into the struggle for justice. She was brave and spoke the truth, even when it was unpalatable to those in power and with guns, and backed up her words with actions, thus putting her life on the line for her beliefs. Such people are rare. I am proud to have known her.

Religion’s last stand: The brain

As almost everyone is aware, the science-religion wars have focused largely on the opposition of some Christian groups to the teaching of evolution. The religious objections to Darwin’s theory of natural selection stem from the concern that if the universe and the diversity of life that we see around us could have come about without the guidance of a conscious intelligence like god (even one operating under the pseudonym of ‘intelligent designer’), then what need would we have for believing in a god?

But while evolution has been the main focus of attention, I see that as more of a preliminary skirmish before the real final battleground for religion, which involves the brain.

The crucial question for the sustaining of religious beliefs is the relationship of the mind to the brain. Is the mind purely a creature of the brain, and our thoughts and decisions merely the result of the neurons firing in our neuronal networks? If so, the mind is essentially a material thing. We may have ideas and thoughts and a sense of consciousness and free will that seem to be nonmaterial, but that is an illusion. All these things are purely the products of interactions of matter in our brains. In this model, the mind is entirely the product of the physical brain. This premise underlies the articles selected for the website MachinesLikeUs.com.

Or is the mind a separate (and non-material) entity that exists independently of the brain and is indeed superior to it, since it is the agent that can cause the neurons in our brain to fire in certain ways and thus enable the brain to think and feel and make decisions? In this model, the ‘mind’ is who ‘I’ really am, and the material body ‘I’ possess is merely the vehicle through which ‘I’ am manifested. In this model, the mind is synonymous with the soul.

If we are to preserve the need for god, then it seems that one must adopt the second model, that human beings (at the very least among animals) are not merely machines operating according to physical laws. We need to possess minds that enable us to think and make decisions and tell our bodies how to act. Most importantly, our minds are supposed to have the capacity for free will. After all, what would be the value of an act of ‘faith’ if the mind were purely driven by mechanical forces in the brain?

It should be immediately obvious why the nature of the mind is a far more disturbing question for religion than evolution is or ever will be. With evolution, the question centers on whether the mechanism of natural selection (and its corollary principles) is sufficient to explain the diversity of life and changes over time. As such, the debate boils down to weighing the evidence for and against and determining which is more plausible.

But plausibility lies in the eye of the beholder and we have seen in a previous posting how the desire to preserve beliefs one holds dear leads people to adopt intellectual strategies that enable them to do so.

Tim van Gelder, writing in the article Teaching Critical Thinking: Some Lessons from Cognitive Science (College Teaching, Winter 2005, vol. 53, No. 1, p. 41-46) says that the strategies adopted are: “1. We seek evidence that supports what we believe and do not seek and avoid or ignore evidence that goes against it. . . 2. We rate evidence as good or bad depending on whether it supports or conflicts with our belief. That is, the belief dictates our evaluation of the evidence, rather than our evaluation of the evidence determining what we should believe. . . 3. We stick with our beliefs even in the face of overwhelming contrary evidence as long as we can find at least some support, no matter how slender.”

In the discussions about evolution, people who wish to preserve a role for god have plenty of viable options at their disposal. They can point to features that seem to have a low probability of occurring without the intervention of an external, willful, and intelligent guidance (aka god). These are the so-called ‘irreducibly complex’ systems touted by intelligent design creationism (IDC) advocates. Or they can point to the seeming absence of transitional fossils between species. Or they can point to seemingly miraculous events or spiritual experiences in their lives.

Scientists argue that none of these arguments are valid, that plausible naturalistic explanations exist for all these things, and that the overwhelming evidence supports evolution by natural selection as sufficient to explain things, without any need for any supernatural being.

But in one sense, that argument misses the point. As long as the debate is centered on weighing the merits of competing evidence and arriving at a judgment, van Gelder’s point is that it does not matter if the balance of evidence tilts overwhelmingly to one side. People who strongly want to believe in something will take the existence of even the slenderest evidence as sufficient for them. And it seems likely that the evolution debate, seeing as it involves complex systems and long and subtle chains of inferential arguments, will always provide some room to enable believers to retain their beliefs.

But the mind/brain debate is far more dangerous for religion because it involves the weighing of the plausibility of competing concepts, not of evidence. The fundamental question is quite simple and easily understood: Is the brain all there is and the mind subordinate to it, a product of its workings? Or is the mind an independently existing entity with the brain subordinate to it?

This is not a question that scientific data and evidence have much hope of answering in the near future. Eliminating the mind as an independently existing entity has all the problems associated with proving a negative, and is similar to trying to prove that god does not exist.

But since the mind, unlike god, is identified with each individual and is not necessarily directly linked to god, discussing its nature carries less religious baggage, and it can be examined more clinically.

Next: Descartes gets the ball rolling on the mind and the brain.

POST SCRIPT: Choosing god

I came across this story (thanks to onegoodmove) that illustrates the point I was trying to make about the way people choose what kind of god to believe in. I have no idea if the events actually occurred, though, or if the story has been embellished to make the point.

The subject was philosophy. Nietzsche, a philosopher well known for his dislike of Christianity and famous for his statement that ‘god is dead’, was the topic. Professor Hagen was lecturing and outside a thunderstorm was raging. It was a good one. Flashes of lightning were followed closely by ominous claps of thunder. Every time the professor would describe one of Nietzsche’s anti-Christian views the thunder seemingly echoed his remarks.

At the high point of the lecture a bolt of lightning struck the ground near the classroom, followed by a deafening clap of thunder. The professor, unfazed, walked to the window, opened it, and started jabbing at the sky with his umbrella. He yelled, “You senile son of a bitch, your aim is getting worse!”

Suffice it to say that some students were offended by his irreverent remark and brought it to the attention of the Department Head. The Department Head in turn took it to the Dean of Humanities who called the professor in for a meeting. The Dean reminded the professor that the students pay a lot of tuition and that he shouldn’t unnecessarily insult their beliefs.

“Oh,” says the professor, “and what beliefs are those?”

“Well, you know,” the Dean says, “most students attending this University are Christians. We can’t have you blaspheming during class.”

“Surely” says the professor, “the merciful God of Christianity wouldn’t throw lightning bolts. It’s Zeus who throws lightning bolts.”

Later the Dean spoke with the Department Head, and said, “The next time you have a problem with that professor, you handle it, and let him make an ass out of you instead.”