Evolutionary theory and falsificationism

In response to a previous posting, commenter Sarah Taylor made several important points. She clearly articulated the view that evolutionary theory is a complex edifice built on many observations that fit into a general, largely chronologically consistent pattern.

She also notes that one distinguishing feature of science is that there are no questions it shirks, no beliefs that it is not willing to put to the test. She says that “What makes scientific theories different from other human proposals about the nature of the universe are their courage. They proclaim their vulnerabilities as their strengths, inviting attack.”

I would mostly agree with this. Science does not shy away from probing its weaknesses, although I would not go so far as to claim that the vulnerabilities are seen as strengths. What is true is that the ‘weaknesses’ of theories are not ignored or covered up but are seen as opportunities for further research. Since there is no such thing in science as infallible knowledge, there is no inherent desire to preserve any theory at all costs, and the history of science is full of once dominant theories that are no longer considered credible.

But having said all that, it is not necessarily true that finding just one contradiction with a theory is sufficient to overthrow the theory. In the context of the challenge to Darwinian theory by intelligent design (ID) advocates, Sarah’s statement that “All that any ID devotee has to do is to show ONE fossil ‘out of place’, to prove the theory doesn’t work. Just one horse shoulder blade in a Cambrian deposit somewhere in the world, and we can say goodbye to Darwin” is a little too strong.

Sarah’s view seems to be derived from the model of falsificationism developed by the philosopher of science Karl Popper (see his book Conjectures and Refutations: The Growth of Scientific Knowledge, 1963), who was trying to explain how science progresses. After showing that trying to prove theories to be true was not possible, Popper argued that what scientists should instead do is try to prove theories false by finding a single counter-instance to the theory’s predictions. If that happens, the theory is falsified and has to be rejected and replaced by a better one. Hence the only status of a scientific theory is either ‘false’ or ‘not yet shown to be false.’

But historians of science have shown that this model, although appealing to our sense of scientific bravado, does not describe how science actually works. Scientists are loath to throw away perfectly productive theories on the basis of a few anomalies. If they did so, then no non-trivial theory would survive. For example, the motion of the perigee of the moon’s orbit disagreed with Newton’s theory for nearly sixty years. Similarly the stability of the planetary orbits was an unsolved problem for nearly 200 years.

Good theories are hard to come by and we cannot afford to throw them away at the first signs of a problem. This is why scientists are quite agreeable to treating such seeming counter-instances as research problems to be worked on, rather than as falsifying events. As Barry Barnes says in his T.S. Kuhn and Social Science (1982): “In agreeing upon a paradigm scientists do not accept a finished product: rather they agree to accept a basis for future work, and to treat as illusory or eliminable all its apparent inadequacies and defects.”

Dethroning a useful theory requires an accumulation of evidence and problems, and the simultaneous existence of a viable alternative. It is like a box spring mattress. One broken spring is not sufficient to make the mattress useless, since the other springs can make up for it and retain the mattress’s functionality. It takes several broken springs to make the mattress a candidate for replacement. And you only throw out the old mattress if you have a better one to replace it with, because having no mattress at all is even worse. The more powerful and venerable the theory, the more breakdowns that must occur to make scientists skeptical of its value and open to having another theory replace it.

After a theory is dethroned due to a confluence of many events, later historians might point to a single event as starting the decline or providing the tipping point that convinced scientists to abandon the theory. But this is something that happens long after the fact, and is largely a rewriting of history.

So I do not think that finding one fossil out of place will dethrone Darwin. And ID does not meet the necessary criteria for being a viable alternative anyway, since it appeals to an unavoidable inscrutability as a factor in its explanatory structure, and that is an immediate disqualification for any scientific theory.

A Theory of Justice

I have to confess that this blog has been guilty of false advertising. On the masthead, of all the items listed, the one thing I have not talked about is books and it is time to make amends.

But first some background. Last week, I spent a thoroughly enjoyable evening having an informal conversation with about 20 students in the lobby of Alumni Hall (many thanks to Carolyn, Resident Assistant of Howe for organizing it). The conversation ranged over many topics and inevitably came around to politics. I had expressed my opposition to the attack on Iraq, and Laura (one of my former students) raised the perfectly legitimate question about what we should do about national leaders like Saddam Hussein. Should we just let them be? My response was to say that people and countries need to have some principles on which to act and apply them uniformly so that everyone (without exception) would be governed by the same principles. The justifications given by the Bush administration for the attack on Iraq did not meet those conditions.

But my response did not have a solid theoretical foundation and I am glad to report that a book that I have started reading seems to provide just that.

The book is A Theory of Justice by John Rawls, in which the author tries to outline what it would take to create a system that would meet the criteria of justice as fairness. The book was first published in 1971 but I was not aware until very recently of its existence. I now find that it is known by practically everyone and is considered a classic, but as I said elsewhere earlier, my own education was extraordinarily narrow, so it is not surprising that I was unaware of it until now.

Rawls says that everyone has an intuitive sense of justice and fairness and that the problem lies in how to translate that desire into a practical reality. Rawls’ book gets off to a great start in laying out the basis for how to create a just society.

“Men are to decide in advance how they are to regulate their claims against one another and what is to be the foundation charter of their society…Among the essential features of this situation is that no one knows his place in society, his class position or social status, nor does anyone know his fortune in the distribution of natural assets and abilities, his intelligence, strength, and the like…The principles of justice are chosen behind a veil of ignorance.” (my emphasis)

In other words, we have to decide what is fair before we know where we will fit into society. We have to create rules bearing in mind that we might be born to any of the possible situations that the ensuing structure might create. Right now what we have is ‘victor’s justice’, where the people who have all the power and privilege get to decide how society should be run, and their own role in it, and it should not surprise us that they see a just society as one that gives them a disproportionate share of the benefits.

Rawls argues that if people were to decide how to structure society based on this ‘veil of ignorance’ premise, they would choose two principles around which to organize things. “[T]he first requires equality in the assignment of basic rights and duties, while the second holds that social and economic inequalities, for example, inequalities of wealth and authority, are just only if they result in compensating benefits for everyone, and in particular for the least advantaged members of society. These principles rule out justifying institutions on the grounds that the hardships of some are offset by a greater good in the aggregate.”

Rawls’ argument has features similar to those young children use when sharing something, say a pizza or a cookie. The problem is that the person who gets to choose first has an unfair advantage. This problem is overcome by deciding in advance that one person divides the object into two portions while the other person gets first pick, thus ensuring that both people should feel that the ensuing distribution is fair.

(Here is an interesting problem: How can you divide a pizza in three ways so that everyone has the sense that it was a fair distribution? Remember, this should be done without precision measurements. The point is to demonstrate the need to set up structures so that people will feel a sense of fairness, irrespective of their position in the selection order.)
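The two-person divide-and-choose procedure above can be sketched in a few lines of code. This is a toy illustration only: the valuations are numbers I have invented for the example, and the pizza is modeled as eight discrete slices (with a continuous cut, the divider could guarantee herself exactly half; with discrete slices the guarantee is only approximate).

```python
def divide_and_choose(divider_vals, chooser_vals):
    """Divider cuts where the two portions look most equal by HER values;
    chooser then takes whichever portion HE values more."""
    half = sum(divider_vals) / 2
    # Divider searches contiguous cut points for her most even split.
    cut = min(range(1, len(divider_vals)),
              key=lambda i: abs(sum(divider_vals[:i]) - half))
    a = list(range(cut))                      # slice indices in portion A
    b = list(range(cut, len(divider_vals)))   # slice indices in portion B
    value = lambda piece, vals: sum(vals[i] for i in piece)
    if value(a, chooser_vals) >= value(b, chooser_vals):
        return b, a    # chooser takes A, divider keeps B
    return a, b        # chooser takes B, divider keeps A

# Hypothetical tastes: the divider prefers slices 0-3, the chooser slices 4-7.
divider_vals = [3, 3, 3, 3, 1, 1, 1, 1]
chooser_vals = [1, 1, 1, 1, 3, 3, 3, 3]
d_piece, c_piece = divide_and_choose(divider_vals, chooser_vals)
```

With these (made-up) valuations the divider ends up with slices worth 9 of her 16 points and the chooser with slices worth 13 of his 16, so each walks away with more than half of the pizza by her or his own lights, which is exactly the sense of fairness the procedure is designed to produce.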

All this great stuff is just in the first chapter. Rawls will presumably flesh out the ideas in the subsequent chapters and I cannot wait to see how it comes out.

I will comment about the ideas in this book in later postings as I read more, because I think the ‘veil of ignorance’ gives a good framework for understanding how to make judgments about public policy.

Urban legends in academia?

Did you hear the story about the college professor who asked his class to write a mid-term essay on “Why George Bush is a war criminal,” and then gave an F grade to a student who had been offended by the assignment and had instead turned in one on “Why Saddam Hussein is a war criminal”?

I wrote about this in an op-ed piece that appeared in today’s (March 4, 2005) Plain Dealer.

You will be asked by the site to fill in your zip-code, year of birth, and gender for some kind of demographic survey. It takes about 10 seconds.

Update on 3/14/05

I received a call today from a person associated with Students for Academic Freedom informing me that this op-ed had triggered the release of more information on their website, where more details are given.

Although the student referred to had not in fact given this testimony at the Colorado Senate hearings, as had been alleged earlier, the level of detail provided on the SAF website (which had not been released until now) is sufficient to remove this story from the category of urban legends, since it gives some names, places, and dates. But a judgment on whether this constitutes academic bullying will have to await the release of the facts about what actually transpired between professor and student. My contact at SAF says that the incident is still under investigation.

Update on the update (3/15/05): It gets curiouser and curiouser.

The blog Canadian Cynic reports that new information on this case has come out and that Horowitz is now backtracking on almost all of the key charges that were originally made. Canadian Cynic highlights Horowitz’s statements that “Some Of Our Facts Were Wrong; Our Point Was Right” and “I consider this an important matter and will get to the bottom of it even if it should mean withdrawing the claim.”

See the article on the website Inside Higher Education. It seems to be the most authoritative source of information on this case.

Content-free political labels

Here’s a quiz. Who said the following:

“In his inaugural address, Mr. Bush calls 9/11 the day “when freedom came under attack.” This is sophomoric. Osama did not send fanatics to ram planes into the World Trade Center because he hates the Bill of Rights. He sent the terrorists here because he hates our presence and policies in the Middle East.

…

The 9/11 killers were over here because we are over there. We were not attacked because of who we are but because of what we do. It is not our principles they hate. It is our policies. U.S. intervention in the Middle East was the cause of the 9/11 terror. Bush believes it is the cure. Has he learned nothing from Iraq?

In 2003, we invaded a nation that had not attacked us, did not threaten us, and did not want war with us to disarm it of weapons it did not have. Now, after plunging $200 billion and the lives of 1,400 of our best and bravest into this war and killing tens of thousands of Iraqis, we have reaped a harvest of hatred in the Arab world and, according to officials in our own government, have created a new nesting place and training ground for terrorists to replace the one we lately eradicated in Afghanistan.”

Was this said by some radical leftist? Some long-haired peacenik? Ward Churchill? Actually, it was Pat Buchanan, a staffer for Richard Nixon and long-time Republican stalwart writing in a recent issue of the magazine The American Conservative.

Ok, here’s another writer:

“The US economy is headed toward crisis, and the political leadership of the country–if it can be called leadership–is preoccupied with nonexistent weapons of mass destruction in the Middle East.

…

Oblivious to reality, the Bush administration has proposed a Social Security privatization that will cost $4.5 trillion in borrowing over the next 10 years alone! America has no domestic savings to absorb this debt, and foreigners will not lend such enormous sums to a country with a collapsing currency–especially a country mired in a Middle East war running up hundreds of billions of dollars in war debt.

A venal and self-important Washington establishment combined with a globalized corporate mentality have brought an end to America’s rising living standards. America’s days as a superpower are rapidly coming to an end. Isolated by the nationalistic unilateralism of the neoconservatives who control the Bush administration, the US can expect no sympathy or help from former allies and rising new powers.”

Who is this Bush-hater? Michael Moore? No, it was none other than Paul Craig Roberts, Assistant Secretary of the Treasury in the Reagan administration and former Associate Editor of the Wall Street Journal editorial page and Contributing Editor of National Review.

The point of my using these quotes is to illustrate my view that the labels ‘liberal’ or ‘conservative,’ ‘Democratic’ or ‘Republican’ have ceased to be meaningful in identifying people’s political positions on many issues. They may at one time have identified particular unifying political philosophies, but they have since lost that content: there are no longer any clear markers one can point to that identify those positions.

Not all political labels have ceased to have content but those four broad-brush categories in particular are used more as terms of political abuse than for any clarifying purpose. Their only purpose is to set up fake debates on television’s political yell shows. If you advertise that you have a liberal and conservative on your panel (or a Democrat and Republican), you can claim that your program is ‘fair and balanced’ even though both people pretty much say the same thing on major policy issues, differing only on minor tactical points or on style.

It makes more sense, rather than identifying and aligning with people on the basis of these meaningless labels, to form alliances on specific issues based on where they stand with respect to those issues. And when one does that, one finds that many of the old divisions melt away.

The greater danger of labels (whether of religion, nationality, or politics) is that they are used to divide us, herd us into boxes, and make us think in terms of what we are supposed to believe and who our allies are supposed to be, rather than what we actually believe and want. They are being used as weapons to split people into ineffective warring factions and thus prevent them from finding commonalities that might lead to concerted action.

I do not agree with Buchanan or Roberts on everything they say. On some things I strongly disagree. But unlike the members of the Third-Tier Pundit™ brigade who should be ignored, they are serious people who often have useful information or perspectives to share and I read them regularly.

Dismissing the ideas of some people simply because of the label attached to them makes as little sense as supporting other people for the same reason.

Putting thought police in the classroom

Most of you would have heard by now about the bill pending in the Ohio legislature (Senate Bill 24) to “establish the academic bill of rights for higher education.”
The bill is both silly and misguided. It mixes motherhood and apple pie language (“The institution shall provide its students with a learning environment in which the students have access to a broad range of serious scholarly opinion pertaining to the subjects they study.”) with language that is practically begging students with even minor grievances to complain to higher authorities.

In a previous posting, I spoke about how lack of trust leads to poor learning conditions and that we need to recreate the conditions under which trust can flourish. This bill goes in the wrong direction because it effectively creates a kind of ‘thought police’ mentality, where any controversial word or idea in class can end up causing a legal battle.

Let me give you an example. The bill says “curricula and reading lists in the humanities and social studies shall respect all human knowledge in these areas and provide students with dissenting sources and viewpoints.”

As an instructor, how would I respect “all” the knowledge in the area? What do we even mean by the word “knowledge”? How do we even separate knowledge in the humanities and social sciences from knowledge in the sciences? What constitutes “dissenting viewpoints”? And how far should “dissenting” be taken? If a particular point of view is not mentioned by the instructor, is that grounds for complaint?

Take another example.

“Students shall be graded solely on the basis of their reasoned answers and appropriate knowledge of the subjects and disciplines they study and shall not be discriminated against on the basis of their political, ideological, or religious beliefs.”

Grading is an art, not a science. It is, at some level, a holistic judgment made by an instructor. To be sure, the instructor has a deep ethical obligation to the profession to assign the grade with as much competence and impartiality as he or she can muster. But even granting that, a letter grade or a point allocation for an assignment is not something that can be completely objectified. Give the same essay or problem to two different teachers and they will likely arrive at different grades, even if both graded “solely on the basis of their reasoned answers and appropriate knowledge.” And this can occur irrespective of how agreeable or disagreeable the instructor perceives the student’s views to be. So if a student complains about a grade, how can this be adjudicated?

As I said in a previous posting, the reason we currently have so many rules in our classrooms is that we seem to have forgotten the purpose of courses, and have lost that sense of trust that is so vital to creating a proper learning atmosphere.

This bill, rather than increasing trust in the classroom, will decrease it. Because as soon as there is legislation prescribing what can and cannot be done in the classroom, it will inevitably lead to teaching and grading issues ending up in the courtroom. And in order to avoid that tedious and expensive process, universities will start instituting detailed lists of rules about what can and cannot be done in the classroom, and teachers will start teaching and assessing defensively, simply to avoid the chance of litigation.

Is this what we want or need?

POST SCRIPT

Tomorrow (Thursday, March 3) from 7:00-9:00 pm in Thwing Ballroom, Case’s Hindu Students Association is hosting an inter-religious dialogue on how to reconcile a belief in God in light of major disasters like the recent tsunami.

There will be a panel of religious scholars representing all the major religious traditions (drawn from the faculty of the Religion department at Case and elsewhere) and plenty of time for discussions. I will be the moderator of the discussion.

The event is free and open to the public and donations will be accepted for tsunami relief efforts.

Living in a reality-free world

Here is some news to curl your hair.

The Harris Poll® #14 of February 18, 2005 reports that:

– 47 percent of Americans believe that Saddam Hussein helped plan and support the hijackers who attacked the U.S. on September 11, 2001;
– 44 percent believe that several of the hijackers who attacked the U.S. on September 11 were Iraqis; and
– 36 percent believe that Iraq had weapons of mass destruction when the U.S. invaded that country.

Virtually no one who has followed these stories believes any of the above to be true. And this poll was released just last week, long after the David Kay and Charles Duelfer reports were made public, putting to rest all the overblown claims that were used to justify the attack on Iraq.

Meanwhile, the one claim that experts do believe to be true, that the U.N. weapons inspectors prevented Saddam Hussein from developing weapons of mass destruction, is accepted by only 46 percent.

How is it that so many Americans seem to be living in a reality-free world?

The reason is that such falsehoods as the ones listed above are strongly implied by influential people and uncritically reported in the media, or influential people stay silent while such falsehoods are propagated.

Take for example a speech made just last week (on February 17, 2005) by California congressman Christopher Cox at the Conservative Political Action Conference. Michelle Goldberg of Salon was at the conference and reports his exact words: “We continue to discover biological and chemical weapons and the facilities to make them inside of Iraq, and even more about their intended use, including that a plan to distribute sarin, and the lethal poison ricin — in the United States and Europe — was actively being pursued as late as March 2003.”

And who were the members of the audience who did not contradict Cox as this nonsense was being spouted? Michelle Goldberg reports that among those “seated at the long presidential table at the head of the room were Henry Hyde, chairman of the House International Relations Committee, Kansas Senator Sam Brownback, Missouri Senator Norm Coleman, Dore Gold, foreign policy advisor to former Israeli Prime Minister Benjamin Netanyahu, and NRA president Kayne Robinson.” Cox’s comments were made while introducing Vice President Dick Cheney, who gave the keynote address.

Now it is possible to carefully deconstruct the congressman’s words so that some semblance of truth can be salvaged. But that would involve re-defining words like ‘discovered’ and ‘weapons’ and ‘facilities’ and ‘plan’ in ways that would make Clinton’s parsing of the word ‘is’ seem like a model of transparency.

So what are we to make of political leaders who can say such deliberately misleading things? What are we to make of other politicians who know the facts but choose to remain silent while the public is led astray? And what are we to make of the national media who spend enormous amounts of time and space on issues like Michael Jackson’s trial but do not provide the kind of scrutiny, factual information, and context that would make politicians more cautious about what they say?

Politicians who mislead the public may simply be cynical, in that they know the truth and are saying things for the sake of political expediency. But the danger of allowing this kind of talk to go unchallenged is that it creates an echo chamber in which people hear the same false things from different directions and start to think they must be true. When people start believing their own propaganda, they have entered a reality-free zone, and that can lead to disastrous consequences.

George Orwell in his essay Politics and the English Language (1946) wrote “Political language…is designed to make lies sound truthful and murder respectable, and to give an appearance of solidity to pure wind.” The sad truth is that Cox’s speech is by no means the only, or even the worst, example of this kind of linguistic chicanery. One has only to go back to the days leading up to the invasion of Iraq to see even more egregious examples of deception by the highest ranking members of the government, and timidity and silence from the supposed watch-dogs in the Congress and media.

Is it any wonder that so many people live in a world that does not exist?

Creationism and moral decay

In the previous posting, I said that the reason there is such hostility to the teaching of evolutionary theory by ID advocates and young-Earth creationists is that they feel it implies a lack of special status for human beings, which leads to atheism, which in turn has led to the current state of moral decay in the US, a decline from a more wholesome past. They feel that eliminating the teaching of evolution is the first step on the road to moral redemption.

There are many flaws in this line of reasoning but for the moment I want to look at one feature and pose the question as to why such people think that the moral state of America is in worse shape now than it was in the past.

Natural selection and moral decay

In a previous posting, I discussed why some religious people found evolutionary theory so upsetting. It was because natural selection implies that human beings were not destined or chosen to be what they are.

While I can understand why this is upsetting to religious fundamentalists who believe they were created specially in God’s image and are thus part of a grand cosmic plan, there is still a remaining puzzle and that is why they are so militant in trying to have evolution not taught in schools or its teaching to be undermined by inserting fake cautions about its credibility. After all, if a person dislikes evolutionary theory for whatever reason, all they have to do is not believe it.

The home of the brave? Or the fearful?

I have done the people of Ohio an injustice. In a previous posting, I said that sometimes it seems to me that there is no half-baked idea that originates anywhere in the known universe that does not quickly find influential adherents anxious to institutionalize it in Ohio.

This was a slur on the people of Ohio implying as it does that we are merely followers. It appears that influential Ohio politicians are quite capable of coming up with original half-baked ideas all on their own. Evidence of this comes from the introduction of Ohio Senate Bill 9 that seeks to expand the provisions of the USA PATRIOT Act (which is already a very disturbing law) and apply these extensions to the people of Ohio.

Jeffrey M. Gamso, Legal Director of the ACLU of Ohio stated the case against the Ohio bill in his testimony before the Ohio’s Senate Judiciary Committee:

“The ACLU of Ohio opposes many of the provisions of S.B. 9. The proposed legislation makes criminal what is already a crime (and may criminalize obedience to the law); requires that people incriminate themselves and in some cases makes criminal their failure to do so; provides sweeping powers to law enforcement to demand identification from wholly innocent persons. It does all that while doing remarkably little to make us either safer or more secure. Like the USA PATRIOT Act, S.B. 9 effects a needless expansion of wide-ranging police powers which threatens the very rights and freedoms that we are struggling to protect.

There are five broad categories of problematic bad legislation tied together in S.B. 9: (1) Legislation which simply duplicates already existing federal law; (2) legislation which provides government with broad powers to investigate and prosecute even wholly innocent activity; (3) legislation which prohibits possession of that which may be misused rather than the misuse itself; (4) legislation which attempts to restrain the people of Ohio from expressing their disapproval of the actions of the government, and (5) legislation which forces people to incriminate themselves. In addition, S.B. 9 may require, in some circumstances, government employees actually to violate existing law – and does so without shielding them from the consequences of such a violation.”

As a proud card-carrying member of many years of the American Civil Liberties Union, I have major concerns about the rapid encroachment on civil liberties in this country under the guise of fighting terrorism.

What amazes me is that so many people are so scared of the possibility of terrorist acts that they are willing to let politicians dismantle even the provisions of the Bill of Rights. It is a disturbing feature of modern American political life that people can be so easily terrified that they surrender without a fight what they should hold most dear. It seems that people are unable to judge how safe is safe enough.

One way to make such a judgment is to compare the probabilities of two scenarios. On the one hand, there is the probability that we are harmed by some terrorist act that this law, had it been enacted, would have prevented. On the other, there is the probability that this law, once enacted, is used not to protect us but against innocent people. Which do you think is more likely?

For me this is a no-brainer. The chances of being the victim of a terrorist attack are very small. Yet, if history is any judge, the chances that laws introduced under the guise of protecting us from ‘outsiders’ will eventually be used against us instead are much higher.
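The comparison can be written down as a back-of-the-envelope calculation. Note that both probabilities below are placeholders invented purely to show the structure of the argument; they are not real statistics.

```python
# Hypothetical per-person probabilities (invented placeholders, not data):
p_attack_prevented = 1e-7  # the law stops an attack that would have harmed you
p_misused = 1e-4           # the law is instead turned against an innocent person

# Which scenario dominates depends only on which probability is larger.
more_likely = "misuse" if p_misused > p_attack_prevented else "prevented attack"
```

Under these illustrative numbers the misuse scenario is a thousand times more likely, which is the shape of the historical argument being made here; the real exercise, of course, is arguing about what the two probabilities actually are.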

So we should oppose this legislation and also seek to sustain the sunset provisions of the USA PATRIOT Act when they fall due at the end of this year.

To find out what you can do, go here.

What makes us good at learning some things and not others?

One of the questions that students ask me is why it is that they find some subjects easy and others hard to learn. Students often tell me that they “are good” at one subject (say writing) and “are not good” at another (say physics), with the clear implication that they feel that there is something intrinsic and immutable about them that determines what they are good at. It is as if they see their learning abilities as being mapped onto a multi-dimensional grid in which each axis represents a subject, with their own abilities lying along a continuous scale ranging from ‘awful’ at one extreme to ‘excellent’ at the other. Is this how it is?

This is a really tough question and I don’t think there is a definitive answer at this time. Those interested in this topic should register for the free public lecture by Steven Pinker on March 14.

Why are some people drawn to some areas of study and not to others? Why do they find some things difficult and others easy? Is it due to the kind of teaching that one receives or parental influence or some innate quality like genes?

The easiest answer is to blame it on genes or at least on the hard-wiring of the brain. In other words, we are born the way we are, with gifts in some areas and deficiencies in others. It seems almost impossible to open the newspapers these days without reading that scientists have found the genes that ‘cause’ this or that human characteristic so it is excusable to jump to genes as the cause of most inexplicable things.

But that is too simple. After all, although the brain comes at birth with some hard-wired structures, it is also quite plastic and the direction in which it grows is also strongly influenced by the experiences it encounters. But it seems that most of the rapid growth and development occurs fairly early in life and so early childhood and adolescent experiences are important in determining future directions.

But what kinds of experiences are the crucial ones for determining future academic success? Here things get murkier, and it is hard to say which factors are dominant. We cannot even say that the same factors play the same role for everyone. For one person, a single teacher’s influence could be pivotal. For another, it could be a parent’s influence. The influences could also be positive or negative.

So there is no simple answer. But I think that although this is an interesting question, the answer has little practical significance for a particular individual at this stage of their life in college. You are now what you are. The best strategy is not to dwell on why you are not something else, but to identify your strengths and use them to your advantage.

It is only when you get really deep into a subject (any subject) and start to explore its foundations and learn about its underlying knowledge structure that you start to develop higher-level cognitive skills that will last you all your life. But this only happens if you like the subject because only then will you willingly expend the intellectual effort to study it in depth. With things that we do not care much about, we tend to skim on the surface, doing just the bare minimum to get by. This is why it is important to identify what you really like to do and go for it.

You should also identify your weaknesses and dislikes and contain them. By “contain” I mean that there is really no reason why, at this stage, you should force yourself to try to like (say) mathematics or physics or Latin or Shakespeare or whatever and try to excel in them, if you do not absolutely need to. What’s the point? What are you trying to prove, and to whom? If there were a really good reason that you needed to know something about those areas now or later in life, the higher-level learning skills you develop by charging ahead in the things you like now could be used to learn what you really need to know later.

I don’t think that people have an innate “limit”, in the sense that there is some insurmountable barrier that prevents them from achieving more in any area. I am perfectly confident that some day, if you needed or wanted to know something in those areas, you would be able to learn it. The plateau or barrier that students think they have reached is largely determined by their inner sense of “what’s the point?”

I think that by the time they reach college, most students have reached the “need to know” stage in life, where they need a good reason to learn something. In earlier K-12 grades, they were in the “just in case” stage, where they did not know where they would be going and needed to prepare themselves for any eventuality.

This has important implications for teaching practice. As teachers, we should make it our goal to teach in such a way that students see the deep beauty that lies in our discipline, so that they will like it for its own sake and thus be willing to make the effort. It is not enough to tell them that it is “useful” or “good for them.”

In my own life, I now happily learn about things that I would never have conceived that I would be interested in when I was younger. The time and circumstances have to be right for learning to have its fullest effect. As Edgar says in King Lear: “Ripeness is all.”

(The quote from Shakespeare is a good example of what I mean. If you had told me when I was an undergraduate that I would some day be familiar enough with Shakespeare to quote him comfortably, I would have said you were crazy because I hated his plays at that time. But much later in life, I discovered the pleasures of reading his works.)

So, to combine the words of the song by Bobby McFerrin and of the prison camp commander in the film The Bridge on the River Kwai, my own advice is: “Don’t worry. Be happy in your work.”

Sources:

John D. Bransford, Ann L. Brown, and Rodney R. Cocking, eds., How People Learn, National Academy Press, Washington, D.C., 1999.

James E. Zull, The Art of Changing the Brain, Stylus Publishing, Sterling, VA, 2002.