Thoughts on the book Soul of a Chef

(Here are my remarks to the class of incoming first year students at Case’s Share the Vision program held in Severance Hall which featured the common reading book Soul of a Chef by Michael Ruhlman.)

They say that two things in life are inevitable – death and taxes. To this you have to add a third: you will have to serve on many committees. Most committees, even in universities, tend to be routine and boring affairs, but one of the best committees I have served on for the past two years is the one that selects the common reading for the incoming class, and which this year selected the book Soul of a Chef. The reason I enjoy this particular committee assignment is that I love books and reading, and this committee brings together students, staff, and faculty who share that interest to talk about books and ideas. This exercise is what lies at the heart of a university. So you never have to twist my arm to get me to serve on this committee.

Having said all that, I must say that when this book was first selected, I had some personal misgivings about it. Let me explain why. The selection process works like this: any member of the university community is welcome to nominate books, so we get a huge number of nominations. Some are immediately eliminated for various practical reasons that I won’t go into, but that still leaves a lot of books remaining. Of course, no one person on the committee can read all the books that survive that first cut, so each of us selects a few to read and reports back to the committee on their merits. We then compare notes, whittle down the list even more, and then make the final selection.

I did not select Soul of a Chef as one of the books that I would personally read. It deals with material that holds no real interest for me. Food to me is largely just a means of sustenance and little more. The world of high cuisine is not my world. In fact, I have never even eaten in a fancy restaurant and have no real desire to do so, except as a curiosity, and only if somebody else is paying the expensive bill. I rarely cook, and when I do, no one else wants to eat what I make. I have no ambitions of rising even to the level of a mediocre cook. So the book basically dealt with a world that was completely foreign to me and which I had no real desire to enter.

But I went along with the choice because the students on the committee were very enthusiastic about it and I respected their judgment. But now I had to read the book. How do you set about reading a book that you are unenthusiastic about? When I flipped open the very first page, I encountered three new words, cooking terms that I had never heard before in my life, which was also kind of discouraging.

It then occurred to me that the situation I was in was a reversal of the typical teacher-student roles. Usually, it is the teacher who selects a book and is really enthusiastic about it, while students are completely baffled as to what is so great about it, groan at the choice, and wonder how on Earth they are going to work their way through it. Those of you who had to read Moby Dick for high school English know exactly what I am talking about. The teacher excitedly announces that you are going to read the greatest novel in American literature and then hands out what initially seems to you like a 500-page, small-type textbook on the whaling industry, a subject in which you never had the slightest interest.

So I told myself to follow the suggestions that I give my students when I assign a book for them to read, knowing that they may not be as enthusiastic about it as I am. Rather than simply read the book and absorb all of it, I tell them to read it with an attitude, with the following four questions in mind, and to focus on those parts of the book that provide answers to them. The four questions are:

1. What is the author trying to convince me of?
2. What is the author assuming that I already think about the topic?
3. Was the author successful in getting me to change my mind?
4. Does the book provide any insights into things that I care more about?

Reading a book with this kind of attitude makes it much more enjoyable because then you are effectively engaging in a dialogue with the author, and sure enough I became very engrossed in the book that I had been initially hesitant to read. It also helped that it is a far easier read than Moby Dick.

So here are my answers to those four questions.

1. What is the author trying to convince me of?

It seemed to me that the author was trying to make the case that being a chef was very demanding and requires one to be very tough, both physically and mentally. There was a macho, even sexist, strain to being a chef that was revealed in the book, especially the first part where the cooks were taking the certified master chef exam. Recall the sole woman who was tearfully eliminated early on. Basically the author was implying that it takes a ‘real man’ to be a chef.

2. What is the author assuming that I already think about the topic?

It seemed to me that the author was assuming that the reader thought cooking was a pretty wimpy activity, a hobby, a pastime, something that anybody could do by just picking up a cookbook and following a recipe.

I will address question 3 after I discuss question 4 about the relevance to things I care about.

4. There were other important educational lessons from the book that relate closely to the academic experience you will have here at Case:

1. Many times students complain that what they learn in class is not related to “real” life. You would think that training to be a chef would not be like this, that it would involve only making real dishes that people eat. But in the certification exercises, I found it interesting that the training of chefs involved having students master highly contrived dishes that they would never actually make as chefs, but which were meant only to develop specific skills that would come in useful in actual cooking situations.

You will find the same thing in college. If you take an introductory physics course, you will study the behavior of blocks sliding down inclined planes and do a lot of problems about them. Here is a secret. No physicist really cares about blocks sliding down planes. The only reason we ask students to study this type of problem is because it is a very good method of learning important basic physics principles that can be applied in real situations.

Often you need to learn things that are artificial and contrived because they highlight important basics that you can then use for real-life complex problems. Many of the things you will do as students may seem arbitrary (just like the timed tests and the pressure that is put on the chefs) but they have a deeper purpose that may not be apparent at first.

I admit that this can be irritating. Following strict rules can seem tiresome especially when you see experts breaking the very rules that they tell you to follow. You too may want to rebel and break them. But great chefs break the rules only after learning all the rules, because only then do they know what rules to break and when, and what the consequences are. Michael Symon is quoted as an example of someone who knows how to break cooking rules to good ends. Professional physicists are also like that.

2. What may seem trivial or irrelevant to a student can, to the expert, be an important sign of understanding. This was the case with the student who cried after failing the buffet portion of the exam (p. 59). To us, a buffet may seem trivial, but not being able to handle it was considered a big deal by the examiners. Things that seem like petty details can contain deep subtleties.

3. Sometimes students think that the only kinds of objective judgments one can make are those on numerical problems or multiple-choice tests. Assessments of essays are thought to be subjective and thus inferior. But the reality is that all assessments are judgments, and expert professionals in a field can often make precise and consistent assessments of things that we might think are purely matters of opinion. You might think that whether a dish is good or not is largely opinion, just like whether a painting is good or not. But experts in those fields can make surprisingly precise and consistent judgments. For example, Brian gets scores of 62.82 on classical cuisine and 62.55 on mystery basket (p. 115). OK, going to the second decimal point is a bit over the top, but the fact remains that the examiners had little difficulty in agreeing on the quality of the dishes and rating them on a 100-point scale. The main difference in judgments between cooking and something that appears more objective, like physics, is that in physics the judgments that need to be made are buried more deeply and are not as easily visible to students until they get to the more advanced levels. But they are still there.

4. When teachers set high standards, it is usually meant to challenge students to reach excellence, not to cause them to fail. Teachers in college are sad when their students fail to do well, just like the examiners were sad when the chefs dropped out at various stages of the exam. Very, very few teachers delight in deliberately failing students and such people do not belong in the teaching profession. Most teachers want their students to succeed and delight when they do so, but at the same time want to ensure that students are challenged so that they grow.

5. The final insight that I got is that the key to success in anything in life is discovering some aspect of the task that you want to do really well and using that as a gateway to other things. In the case of Thomas Keller, it started with his obsession with making a perfect Hollandaise sauce (p. 266). In repeatedly trying to perfect it, he realized that he wanted to be a chef and used that as his entry point.

Of course, you may not agree with me on any of these answers. That is the beauty of books. They do not have a unique meaning, even to the author. A writer of novels tells of how his book was assigned as a high school text, and as a result he would occasionally get phone calls from students who had tracked him down. The students would say that their teacher wanted them to write about what a particular passage means, and they thought that the author would know the ‘real’ answer. He would tell them that he did not know what it meant any better than they did.

All knowledge is obtained by taking the words that are ‘out there’ in books and other sources and combining them with our own life experiences to construct our own meanings. This is why the discussions that you have in seminars, and with your friends and companions at other times, are so important to learning, because that is how we best figure out what we believe and what books are saying to us. If your experience at Case ends up as a four-year-long in-depth conversation about ideas with other students and faculty, then you will have gotten a real education.

For the third question, was the author successful in convincing me to change my mind? All I want to say is that while reading the book, especially the first part dealing with the grueling certification exam, Stanley Kubrick’s film Full Metal Jacket kept coming to my mind. The first half of that film dealt with the brutal and grueling training that new recruits to the Marines undergo.

So I guess the author did manage to persuade me that being a chef required real toughness.

The language of science

Good scientists write carefully but not defensively. By carefully, I mean that they strive to be clear and to not over-reach, i.e., reach conclusions beyond what is warranted by the evidence. But they are not overly concerned with whether their words will be taken out of context and misused or subject to other forms of manipulation. It is an unwritten rule of scientific discourse that you do not score cheap debating points. Scientists are expected to respect those who oppose them and deal with the substance of their arguments and not indulge in superficial word games.

Why “balanced coverage” does not always lead to good science journalism

In a previous post, I showed how George Monbiot of the Guardian newspaper provided an example of good science reporting, distinguishing the credible from those who indulge in wishful thinking. But unfortunately, he is an exception. And Chris Mooney writing in 2004 in the Columbia Journalism Review describes how the more common journalistic practice of attempting to provide “balanced coverage” of a scientific issue tends to allow the scientific fringe elements to distort reality.

As the Union of Concerned Scientists, an alliance of citizens and scientists, and other critics have noted, Bush administration statements and actions have often given privileged status to a fringe scientific view over a well-documented, extremely robust mainstream conclusion. Journalists have thus had to decide whether to report on a he said/she said battle between scientists and the White House — which has had very few scientific defenders — or get to the bottom of each case of alleged distortion and report on who’s actually right.
. . .
Energy interests wishing to stave off action to reduce greenhouse gas emissions have a documented history of supporting the small group of scientists who question the human role in causing climate change — as well as consciously strategizing about how to sow confusion on the issue and sway journalists.

In 1998, for instance, John H. Cushman, Jr., of The New York Times exposed an internal American Petroleum Institute memo outlining a strategy to invest millions to “maximize the impact of scientific views consistent with ours with Congress, the media and other key audiences.” Perhaps most startling, the memo cited a need to “recruit and train” scientists “who do not have a long history of visibility and/or participation in the climate change debate” to participate in media outreach and counter the mainstream scientific view. This seems to signal an awareness that after a while, journalists catch on to the connections between contrarian scientists and industry.
. . .
In a recent paper published in the journal Global Environmental Change, the scholars Maxwell T. Boykoff and Jules M. Boykoff analyzed coverage of the issue in The New York Times, The Washington Post, The Wall Street Journal, and the Los Angeles Times between 1988 and 2002. During this fourteen-year period, climate scientists successfully forged a powerful consensus on human-caused climate change. But reporting in these four major papers did not at all reflect this consensus.

The Boykoffs analyzed a random sample of 636 articles. They found that a majority — 52.7 percent — gave “roughly equal attention” to the scientific consensus view that humans contribute to climate change and to the energy-industry-supported view that natural fluctuations suffice to explain the observed warming. By comparison, just 35.3 percent of articles emphasized the scientific consensus view while still presenting the other side in a subordinate fashion. Finally, 6.2 percent emphasized the industry-supported view, and a mere 5.9 percent focused on the consensus view without bothering to provide the industry/skeptic counterpoint.

Most intriguing, the Boykoffs’ study found a shift in coverage between 1988 — when climate change first garnered wide media coverage — and 1990. During that period, journalists broadly moved from focusing on scientists’ views of climate change to providing “balanced” accounts. During this same period, the Boykoffs noted, climate change became highly politicized and a “small group of influential spokespeople and scientists emerged in the news” to question the mainstream view that industrial emissions are warming the planet. The authors conclude that the U.S. “prestige-press” has produced “informationally biased coverage of global warming . . . hidden behind the veil of journalistic balance.”
. . .
Some major op-ed pages also appear to think that to fulfill their duty of providing a range of views, they should publish dubious contrarian opinion pieces on climate change even when those pieces are written by nonscientists. For instance, on July 7, 2003, The Washington Post published a revisionist op-ed on climate science by James Schlesinger, a former secretary of both energy and defense, and a former director of Central Intelligence. “In recent years the inclination has been to attribute the warming we have lately experienced to a single dominant cause — the increase in greenhouse gases,” wrote Schlesinger. “Yet climate has always been changing — and sometimes the swings have been rapid.” The clear implication was that scientists don’t know enough about the causes of climate change to justify strong pollution controls.

That’s not how most climatologists feel, but then Schlesinger is an economist by training, not a climatologist. Moreover, his Washington Post byline failed to note that he sits on the board of directors of Peabody Energy, the largest coal company in the world, and has since 2001.

Eldan Goldenberg, who has long been concerned with the way science is reported, kindly sent to me a report put out by the Stratfor group about a conference of journalists and scientists convened last month to discuss this very issue. Some excerpts:

Panels of journalists and scientists gathered July 25 at the Woodrow Wilson International Center for Scholars in Washington to discuss the mainstream media’s reporting on climate change. The consensus was that the media have not covered the issue well.

According to both panels, the greatest shortcoming has been in persistent portrayals of the issue as one of contentious scientific debate: In reality, the assembled scientists said, man-made climate change is generally accepted throughout the scientific community as a reality.

Most of the time at the conference was dedicated to examining the media’s portrayal of the issue and explaining how it came into being. The root of the problem, most participants agreed, is that climate change has been covered primarily as a political rather than a scientific issue — and thus, the media have focused on the political debate rather than the science behind it.

In the background of this discussion loomed a larger issue: The mainstream media, recognizing that there is more to the story, now are struggling with ways to change their portrayal of the climate change issue. Arguments are emerging that the scientific debate has now been concluded, “industry” has lost and the new debate is about policy options. Though this line of thinking is nearer to the truth, it does not entirely close the gap. The fact is that industry all but stopped contesting the premise of man-made climate change two years ago, but the media’s preoccupation with the traditional battle lines — industry versus environmentalists — continues to obscure the complexity of the issue and the positions of various players.
. . .
Because the media continue to write about these matters as political issues — debates between two interested parties — the scientific questions at the center of campaigns on climate change, the relative risk of various chemicals and substances and the risks posed by genetically modified organisms have been relegated to the back burner. Rather than being the focus in the policy debates, the science is used as a tactic in a communications and public relations battle.

The proposed solution to this problem is that journalists should eschew the goal of “balanced coverage” when it comes to science. This, I believe, is unworkable in practice because it would single out science for a different kind of treatment than other topics receive. Journalists are generalists, sometimes covering science, sometimes shifted to other beats. It is unreasonable to expect them to radically shift their mode of operation depending on the topic.

In fact, I believe that this problem is not limited to global warming or to scientific issues generally. Instead, I feel that this idea of “balanced coverage,” which has become the journalistic ideal in the US, produces lower-quality journalism in general.

But that is a topic for another day.

How science reporters should do their job

About a year ago, Eldan Goldenberg had a post complaining about the lousy job that reporters do when covering science. (They do an even worse job when covering the government’s fraudulent case for going to war, but that’s a post for another day.)

The way that they cover global warming is a good example of the problem. But before we get to the bad news, let’s first look at how a good science reporter should do the job, and for this there is an excellent example in George Monbiot of the London Guardian newspaper.

Taking steps to avoid global warming

One of the curious features of the debate over global warming is the question of what should be done about it. I can actually understand the position of those who are skeptical about whether things like the Kyoto treaty will solve the problem. I can understand those who worry that government regulations might not work.

What puzzles me are those people who somehow see the actions taken to reduce the production of greenhouse gases as some sort of affront that has to be opposed.

Should secularists fight for 100% separation of church and state?

(This week I will be on my long-anticipated drive across the country to San Francisco. During that time, I am reposting some of the very early items from this blog.

Thanks to all those who gave me suggestions on what to see on the way. I now realize that I must have been crazy to think that I could see more than a tiny fraction of the natural sights of this vast and beautiful country, and will have to do many more trips.

I will start posting new items on Monday, August 21, 2006.)

Like most atheists, I really have no concern about what other people believe. If you do not believe in a god or in heaven and hell in any form, then the question of what other people believe about god is of as little concern to you as questions about which sports teams they root for or what cars they drive.

If you are a follower of a theistic religion, however, you cannot help but feel part of a struggle against evil, and often that evil is personified as Satan, and non-believers or believers of other faiths can be seen as followers of that evil. Organized religions also need members to survive, to keep the institution going. So for members of organized religion, there is often a mandate to try and get other people to also believe, and thus we have revivals and evangelical outreach efforts and proselytizing.

But atheists have no organization to support and keep alive with membership dues. We have no special book or building or tradition to uphold and maintain. You will never find atheists going from door to door spreading the lack of the Word.

This raises an interesting question. Should atheists be concerned about religious symbolism in the public sphere such as placing nativity scenes on government property at Christmas or placing tablets of the Ten Commandments in courthouses, both of which have been the subjects of heated legal struggles involving interpretations of the First Amendment to the constitution? If those symbols mean nothing to us, why should we care where they appear?

In a purely intellectual sense, the answer is that atheists (and other secularists) should not care. Since for the atheist the nativity scene has as little meaning as any other barnyard scene, and the Ten Commandments have as much moral force as (say) any of Dave Letterman’s top ten lists, why should these things bother us? Perhaps we should just let these things go and avoid all the nasty legal fights.

Some people have advocated just this approach. Rather than fighting for 100% separation of church and state, they suggest that we should compromise on some matters. That way we can avoid the divisiveness of legal battles and also prevent the portrayal of atheists as mean-spirited people who are trying to obstruct other people from showing their devotion to their religion. If we had (say) 90% separation of church and state, wouldn’t that be worth it in order to stop the acrimony? Bloggers Matthew Yglesias and Kevin Drum present arguments in favor of this view, and it does have a certain appeal, especially for people who prefer to avoid confrontations and have a live-and-let-live philosophy.

But this approach rests on a critical assumption that has not been tested and is very likely to be false. This assumption is that the religious community that is pushing for the inclusion of religious symbolism in the public sphere has a limited set of goals (like the items given above) and that it will stop pushing once those goals have been achieved. This may also be the assumption of those members of non-Christian religions in the US who wish to have cordial relations with Christians and thus end up siding with them on the religious symbolism question.

But there is good reason to believe that the people pushing hardest for the inclusion of religious symbolism actually want a lot more than a few tokens of Christian presence in the public sphere. They actually want a country that is run on “Christian” principles (for the reason for the quote marks, see here). For them, a breach in the establishment clause of the First Amendment for seemingly harmless symbolism is just the overture to a movement to eventually have their version of religion completely integrated with public and civic life. (This is similar to the “wedge strategy” using so-called intelligent design (ID). ID advocates see the inclusion of ID (with its lack of an explicit mention of god) in the science curriculum as the first stage in replacing evolution altogether and bringing god back into the schools.)

Digby, the author of the blog Hullabaloo, argues that although he also does not really care about the Ten Commandments and so on, he thinks that the compromise strategy is a bad idea. He gives excellent counter-arguments and also provides some good links on this topic. Check out both sides. Although temperamentally my sympathies are with Yglesias and Drum, I think Digby wins the debate.

So the idea of peaceful coexistence on the religious symbolism issue, much as it appeals to people who don’t enjoy the acrimony that comes with conflicts over principle, may be simply unworkable in practice.

The journey to atheism


In a comment to a previous post, Jim Eastman said something that struck me as very profound. He said:

It’s also interesting to note that most theists are also in the game of declaring nonexistence of deities, just not their own. This quote has been sitting in my quote file for some time, and it seems appropriate to unearth it.

“I contend we are both atheists – I just believe in one fewer god than you do. When you understand why you reject all other gods, you will understand why I reject yours as well.” – Stephen F. Roberts

This quote captures accurately an important stage in my own transition from belief to atheism. Since I grew up as a Christian in a multi-religious society and had Hindu, Muslim, and Buddhist friends, I had to confront the question of how to deal with other religions. My answer at that time was simple – Christianity was right and the others were wrong. Of course, since the Methodist Church I belonged to had an inclusive, open, and liberal theological outlook, I did not equate this distinction with good or evil or even heaven and hell. I felt that as long as people were good and decent, they were somehow all saved, irrespective of what they believed. But there was no question in my mind that Christians had the inside track on salvation and that others were at best slightly misguided.

But as I got older and reached middle age, I found the question posed by Roberts increasingly hard to answer. It became clear to me that when I said I was a Christian, this was not merely a statement of what I believed. Implicitly I was also saying, in effect if not in words, that I was not a Hindu, Muslim, Jew, Buddhist, etc. As in the quote above, I could not satisfactorily explain to myself the basis on which I was rejecting those religions. After all, like most people, I believed in my own religion simply because I had grown up in that tradition. I had little or no knowledge of other religions and hence had no grounds for rejecting them. In the absence of a convincing reason for rejection, I decided to just remove myself from any affiliation whatsoever, and started to consider myself a believer in a god that was not bound by any specific religious tradition.

But when one is just a free-floating believer in god, without any connection to organized religion and the comforting reinforcement that comes with regular worship with others, one starts asking difficult questions about the nature of god and the relationship to humans, questions that the answers provided by organized religious dogma simply do not satisfy. When one is part of a church or other religious structure, one struggles with difficult questions (suffering, the virgin birth, the nature of the Trinity, original sin, the basis for salvation, etc.), but those difficulties are addressed within a paradigm that assumes the existence of god, and thus one always has, as a last option, the response that the ways of god are enigmatic and beyond the comprehension of mere mortals.

But when I left the church, I started struggling with different questions, such as why I believed that god existed at all. And if she/he/it did exist, how, where, and in what form did that existence manifest itself, and what precisely was the nature of the interaction with humans?

I found it increasingly hard to come up with satisfactory answers to these questions and I remember the day when I decided that I would simply jettison the belief in god altogether. Suddenly everything seemed simple and clear. It is possible that I had arrived at this conclusion even earlier but that my conscious mind was rejecting it until I was ready to acknowledge it. It is hard, after all, to give up a belief that has been the underpinning of one’s personal philosophy. But the feeling of relief that accompanied my acceptance of non-belief was almost palpable and unmistakable, making me realize that my beliefs had probably been of a pro forma sort for some time.

Especially liberating to me was the realization that I did not have to examine all new discoveries of science to see if they were compatible with my religious beliefs. I could now go freely wherever new knowledge led me without wondering if it was counter to some religious doctrine.

A childhood friend of mine who knew me during my churchgoing phase was surprised by my change and reminded me of two mutual friends who, again in middle age, had made the transition in the opposite direction, from atheism to belief. He asked me if it was possible that I might switch again.

It is an interesting question to which I, of course, cannot know the answer. My personal philosophy satisfies me now but who can predict the future? But while conversions from atheism to belief and vice versa are not uncommon, I am not sure how common it is for a single person to make two such U-turns and end up close to where they started. It seems like it would be a very unlikely occurrence. I don’t personally know of anybody who did such a thing.

Agnostic or atheist?

(This week I will be on my long-anticipated drive across the country to San Francisco. During that time, I am reposting some of the very early items from this blog.

Thanks to all those who gave me suggestions on what to see on the way. I now realize that I must have been crazy to think that I could see more than a tiny fraction of the natural sights of this vast and beautiful country, and will have to do many more trips.

I will start posting new items on Monday, August 21, 2006.)

I am sure that some of you have noticed that you get a more negative response to saying you are an atheist than to saying that you are an agnostic. For example, in a comment to a previous posting, Erin spoke about finding it “weird that atheism is so counter-culture. Looking back at my youth, announcing your non-belief in God was a surefire shock tactic.” But while I have noticed that people are shocked when someone says that he/she is an atheist, they are a lot more comfortable with you saying that you are an agnostic. As a result, some people might call themselves agnostics just to avoid the raised eyebrows that come with being seen as an atheist, lending support to the snide comment that “an agnostic is a cowardly atheist.”

I have often wondered why agnosticism produces such a milder reaction. Partly the answer is public perceptions. Atheism, at least in the US, is associated with people who very visibly and publicly challenge the role of god in the public sphere. When Michael Newdow challenged the legality of the inclusion of “under God” in the Pledge of Allegiance that his daughter had to say in school, the media focused on his atheism as the driving force, though there are religious people who also do not like this kind of encroachment of religion into the public sphere.

In former times, atheism was identified with the flamboyant and abrasive Madalyn Murray O’Hair, whose legal action led in 1963 to the US Supreme Court voting 8-1 to ban “‘coercive’ public prayer and Bible-reading at public schools.” (In 1964 Life magazine referred to her as the most hated woman in America.) I discussed earlier that the current so-called intelligent design (ID) movement in its “Wedge” document sees this action as the beginning of the moral decline of America and is trying to reverse that course by using ID as a wedge to infiltrate god back into the public schools. Since O’Hair also founded the organization American Atheists, some people speculate that the negative views that Americans have of atheism are due to the movement’s close identification with her.

I think that it may also be that religious people view atheism as a direct challenge to their beliefs, since they take atheism to mean that you believe there definitely is no god and hence that they must be wrong. Agnostics, on the other hand, are seen as keeping an open mind about the possible existence of god, and so as accepting that believers might be right.

The distinction between atheism and agnosticism is a bit ambiguous. For example, if we go to the Oxford English Dictionary, the words are defined as follows:

Atheist: One who denies or disbelieves the existence of a God.

Agnostic: One who holds that the existence of anything beyond and behind material phenomena is unknown and (so far as can be judged) unknowable, and especially that a First Cause and an unseen world are subjects of which we know nothing.

The definition of atheism seems to me to be too hard and creates some problems. Denying the existence of god seems to me to be unsustainable. I do not know how anyone can reasonably claim that there definitely is no god, simply because of the logical difficulty of proving a negative. It is like claiming that there is no such thing as an extra-terrestrial being. How can one know such a thing for sure?

The definition of agnosticism, on the other hand, seems to me to be too soft, as if it grants the existence of god in some form, but says we cannot know anything about she/he/it.

To me the statement that makes a good starting point is the phrase attributed to the scientist-mathematician Laplace in a possibly apocryphal story. When he presented his book called the System of the World, Napoleon is said to have noted that god did not appear in it, to which Laplace is supposed to have replied that “I have no need for that hypothesis.”

If you hold an expanded Laplacian view that you have no need for a god to provide meaning or explanations and that the existence of god is so implausible as to be not worth considering as a possibility, what label can be put on you, assuming that a label is necessary? It seems like this position puts people somewhere between the Oxford Dictionary definitions of atheist and agnostic. But until we have a new word, I think that the word atheist is closer than agnostic and we will have to live with the surprise and dismay that it provokes.

Shafars and brights arise!


Sam Smith runs an interesting website called the Progressive Review. It is an idiosyncratic mix of political news and commentary with oddball, amusing, and quirky items culled from various sources thrown in. Mixed with these are his own thoughtful essays on various topics and one essay that is relevant to this series of posts on religion and politics is his call for “shafars” (an acronym he has coined that stands for people who identify with secularism, humanism, atheism, free thought, agnosticism, or rationalism) to play a more visible and assertive role in public life and to not let the overtly religious dominate the public sphere.

“I know this is not politically correct but….”


One of the advantages of being older is that sometimes you can personally witness how language evolves and changes, and how words and phrases undergo changes and sometimes outright reversals of meaning.

One of the interesting evolutions is that of the phrase “politically correct.” It was originally used as a kind of scornful in-joke within Marxist political groups to sneer at those members who seemed to have an excessive concern with political orthodoxy and who seemed to be more concerned with vocabulary than with the substance of arguments and actions.

But later it became used against those who were trying to use language as a vehicle for social change by making it more nuanced and inclusive and less hurtful, judgmental, or discriminatory. Such people advocated using “disabled” instead of “crippled” or “mentally ill” instead of “crazy,” or “hearing impaired” instead of “deaf” and so on in an effort to remove the stigma under which those groups had traditionally suffered. Those who felt such efforts had been carried to an extreme disparaged those efforts as trying to be “politically correct.”

The most recent development has been to shift the emphasis from sneering at the careful choosing of words to sneering at the ideas and sentiments behind those words. The phrase has started being used pre-emptively, to shield people from the negative repercussions of stating views that might otherwise be offensive or antiquated. This usage usually begins with “I know this is not politically correct but….” and then finishes with a statement that would normally provoke quick opposition. So you can now find people saying “I know this is not politically correct but perhaps women are inferior to men at mathematics and science” or “I know this is not politically correct but perhaps poor people are poor because they have less natural ability” or “I know this is not politically correct but perhaps blacks are less capable than whites at academics.” The opening preamble is not only designed to make such statements acceptable; it even lets the speaker claim the mantle of being daring and brave, an outspoken and even heroic bearer of unpopular or unpalatable truths.

Sentiments that would normally be considered discriminatory, biased, and outright offensive if uttered without any supporting evidence are protected from criticism by this preamble. It is then the person who challenges this view who is put on the defensive, as if he or she were some prig who unthinkingly spouts the orthodox view.

As Fintan O’Toole of The Irish Times (May 5, 1994), who noticed this trend early, pithily said:

We have now reached the point where every goon with a grievance, every bitter bigot, merely has to place the prefix, “I know this is not politically correct but…” in front of the usual string of insults in order to be not just safe from criticism but actually a card, a lad, even a hero. Conversely, to talk about poverty and inequality, to draw attention to the reality that discrimination and injustice are still facts of life, is to commit the new sin of political correctness… Anti-PC has become the latest cover for creeps. It is a godsend for every sort of curmudgeon or crank, from the fascistic to the merely smug.

Hate blacks? Attack positive discrimination – everyone will know the codes. Want to keep Europe white? Attack multiculturalism. Fed up with the girlies making noise? Tired of listening to whining about unemployment when your personal economy is booming? Haul out political correctness and you don’t even have to say what’s on your mind.

Even marketers are cashing in on this anti-PC fad, as illustrated by this cartoon.

Perhaps it is my physics training, but I tend to work from the principle that in the absence of evidence to the contrary, we should assume that things are equal. For example, physicists assume that all electrons are identical. We don’t really know this for a fact, since it is impossible to compare all electrons. The statement “all electrons are identical” is a kind of default position and, in the absence of evidence to the contrary, does not need to be supported by positive evidence.

But the statement “some electrons are heavier than others” is going counter to the default position and definitely needs supporting evidence to be taken seriously. Saying “I know this is not politically correct but I think some electrons are heavier than others” does not make it any more credible.

The same should hold for statements that deal with people, because I would like to think that the default position is that people are (on average) pretty much the same in their basic needs, desires, feelings, hopes, and dreams.