Using education to entrench privilege

Suppose you are in charge of a community college and there turns out to be a huge demand for math and English classes, so much so that students are repeatedly turned away because the classes are full. You might think that it is a good thing that people are seeking more education and that the solution is to open up more classes to meet that demand by (say) hiring more math and English teachers. [Read more...]

Critical thinking and argumentation

If there is one thing that people can agree on as a universal good in education, it is that we should seek to increase people's critical thinking abilities. Actually, that is not quite true. During the days when the debate over intelligent design was raging in Ohio in 2003, one letter to the editor by an ID advocate dismissed [Read more...]

Teenager faces down Kansas governor and school principal

High schooler Emma Sullivan refuses to apologize to Kansas governor Sam Brownback for criticizing him on Twitter. The governor’s staff apparently scours the internet for unflattering things about him; they noticed her tweet and reported her to her school principal, who, rather than standing up for Emma’s free speech rights (after all, if making fun of a politician isn’t protected by the First Amendment, what is?), demanded that she apologize.

You knew from the beginning that this could not help but end badly for Brownback, and so it has. He has now been forced to apologize for his staff’s overreaction. The school district has also backed off its demand for an apology.

Once again, I think that this is a victory for the internet. Emma received a lot of support from the blogosphere and it may have helped her stand firm against the bullying.

Way to go, Emma.

Case Connection Zone

One of the best things about working at Case Western Reserve University is that it has been very forward-looking and supportive in providing technology to serve the needs of its students, employees, and the community.

In the early days of the internet, CWRU, with its Freenet system, was the first in the nation to provide free internet access to anyone who had a dial-up modem. It was later the first university campus to have an entirely fiber-optic network reaching every office, classroom, and dorm room on the campus.

In partnership with other local non-profit groups, CWRU has been expanding free broadband access for city dwellers. This video (admittedly also a plug for the university) shows a new initiative to provide free gigabit fiber-optic broadband access to the campus community and an adjacent neighborhood, to research what kinds of new uses might emerge, with an eye to expanding the reach of the network.

The secrets of an academic ghostwriter

The Chronicle of Higher Education recently had an article by someone who has made a good living (about $66,000 this year) writing custom research papers on almost any topic for undergraduate and graduate students who hire him to do their assignments.

I’ve written toward a master’s degree in cognitive psychology, a Ph.D. in sociology, and a handful of postgraduate credits in international diplomacy. I’ve worked on bachelor’s degrees in hospitality, business administration, and accounting. I’ve written for courses in history, cinema, labor relations, pharmacology, theology, sports management, maritime security, airline services, sustainability, municipal budgeting, marketing, philosophy, ethics, Eastern religion, postmodern architecture, anthropology, literature, and public administration. I’ve attended three dozen online universities. I’ve completed 12 graduate theses of 50 pages or more. All for someone else.

His strategy was to collect the minimum necessary information from Wikipedia and other online sources and simply write everything down in one pass, cutting and pasting quotes and using filler language to reach the necessary word count, without rewriting, editing, or polishing.

After I’ve gathered my sources, I pull out usable quotes, cite them, and distribute them among the sections of the assignment. Over the years, I’ve refined ways of stretching papers. I can write a four-word sentence in 40 words. Just give me one phrase of quotable text, and I’ll produce two pages of ponderous explanation. I can say in 10 pages what most normal people could say in a paragraph.

I’ve also got a mental library of stock academic phrases: “A close consideration of the events which occurred in ____ during the ____ demonstrate that ____ had entered into a phase of widespread cultural, social, and economic change that would define ____ for decades to come.” Fill in the blanks using words provided by the professor in the assignment’s instructions.

The reason he gets away with this is that it is what some students do on their own. For them too, the first version is the one they hand in as their ‘finished’ work, so the roughness of the submitted manuscript must seem familiar to the teacher. As the author says:

I don’t ever edit my assignments. That way I get fewer customer requests to “dumb it down.” So some of my work is great. Some of it is not so great. Most of my clients do not have the wherewithal to tell the difference, which probably means that in most cases the work is better than what the student would have produced on his or her own. I’ve actually had customers thank me for being clever enough to insert typos. “Nice touch,” they’ll say.

As a writing generalist myself, I was vaguely curious about whether I could be as successful a ghostwriter, assuming that I could overcome any scruples. I don’t think I could, simply because over the years I have developed habits that would give me away immediately. I would not be able to avoid being opinionated, and this would undoubtedly arouse suspicion. I am also somewhat obsessive about avoiding typos and grammatical errors, repeatedly rewriting and editing even my blog posts. My books may not be great works of literature, but they are ‘clean’ in the sense that they have very few or no basic errors of this sort. All this attention to detail would slow me down too much, while also likely setting off alarm bells for the reader. As an academic hired gun, I would be a bust.

I was of course bothered by students passing off other people’s work as their own and wondered how widespread the practice was. But I was also impressed by the writer’s ability to churn out papers on topics for which he had no training and yet fool the students’ teachers, and even their graduate thesis advisors, into thinking the students had written them.

This article makes for fascinating but disturbing reading and is as much an indictment of the way our educational system is structured, which enables such practices to pass undetected, as it is of the students who use ghostwriters.

College as a Disney World of Learning

(Talk given at Case Western Reserve University’s Share the Vision program, Severance Hall, Friday, August 21, 2009 1:00 pm. This program is to welcome all incoming first year students. My comments centered on the common reading book selection Three Cups of Tea by Greg Mortenson and David Oliver Relin. Mortenson will be the speaker at the annual fall convocation to be held on Wednesday, August 26, 2009 in Severance Hall at 4:30 pm.)

As I read the book Three Cups of Tea, two stories struck me. One begins on page 202 and is that of the little boy Mohammed Aslam Khan, who was sent by his father, alone, on a perilous journey downriver in frigid waters, all so that he might get a chance at an education. Despite all the odds against him, he not only survived the trip but got a good education and returned to the village to become an educational leader.

The other story is on page 31, where Mortenson describes his amazement when he saw eighty-two children assemble by themselves and work on their lessons in the open, in the cold, some writing on the ground with sticks, since the village could afford a teacher for only three days a week; on the other days they were on their own.

As Mortenson said, “Can you imagine a fourth-grade class in America, alone, without a teacher, sitting there quietly and working on their lessons?”

Why were the people in that remote region of Pakistan willing to go through so much in order to get an education? Compare that with the situation in the US, where learning is often seen as something to be avoided, and where some teachers get complaints when they cover too much ground. When schools are closed or lessons cancelled due to some emergency, it is usually a cause for cheering amongst students. As a colleague of mine here said recently, education may be the only thing in the US where people actually want less than what they pay for.

There are of course classes, teachers, and students in the US where learning for its own sake is valued, but these are unfortunately few. Still, I do not believe that there is any fundamental difference between the children in those remote villages of Pakistan and Afghanistan and those in the US that explains this difference in attitude.

What may be true is that America suffers, if that is the right word, from too easy access to education. Schooling is fairly easily available and, at least in the K-12 sector, is free. A good analogy is with food, which is also freely and cheaply available in the US compared with other countries. And we waste and throw away vast amounts of it. I am sure your mothers pleaded with you to eat your vegetables, invoking images of starving children in China who would relish the food that you want to dump in the trash. Actually, given the economic crisis in the US and the rapidly rising economic power of China, soon Chinese mothers might be pleading with their spinach-rejecting children to think of poor starving children in the US.

Students in the US, because of the ease and abundance of educational opportunities, have to be exhorted to take advantage of these resources, just as they have to be coaxed to eat their broccoli, and this may be devaluing education in their eyes, because people tend not to value the things that are easily available.

This is why the story of the immense struggles and sacrifices made by the villagers that Mortenson worked with to build their schools is so inspiring. They realized that education is a precious gift to be cherished, not something whose availability can be taken for granted.

All of you are now embarking on four years of education here at Case Western Reserve University. Some people may tell you that college will be the happiest time in your lives. I disagree. In fact, it would be very sad if the happiest years of your life were over by the age of twenty-two. So I hope that you will have much happier times in the future.

But there is one aspect in which these four years will be a unique experience that you must take advantage of to the fullest. It is the one time in your life when you will be surrounded by people who want nothing else but to help you learn. The world-class faculty here, who are experts on all manner of things, will share their knowledge and expertise freely and willingly. Here you will get free access to incredible libraries full of books, journals, magazines, audio-visual materials, and newspapers, and to librarians who are positively eager to help you use them. And it is all available to you just for the asking. Once you graduate and go out, that opportunity is gone.

Of course, all this is not technically ‘free’ since you are paying tuition that, despite the extraordinary fund-raising abilities of our president, is still considerable. But the way to think of tuition fees is the way you would think of the admission price to Disney World or other amusement parks. It is not cheap to get in, but once you are in, you try to get as much out of your time there as possible. It would be absurd to spend all your time sitting on a bench eating ice cream or surfing the web or sleeping.

You should have that attitude during the years you spend here. Think of Case Western Reserve University as the Disney World of learning. You have paid the admission fee in terms of grades and tuition. Now that you are in, rather than getting by with minimal work, you should try to take in as much learning as possible, formally in classes, and informally in all the talks and seminars and casual discussions with teachers and fellow students. Once you develop that attitude towards learning, you will find that it is much more fun than roller coaster rides, with none of the accompanying motion sickness.

I am lucky in that I actually work here and take full advantage on a daily basis of the knowledge that is so freely available. And I would urge you to do the same. In fact, as soon as this program is over, and you have some free time, you should go over to the library and see what they offer, and you should go to all the museums that are right here in University Circle, as the first steps in a four-year adventure of learning.

Trust me, you will never regret it.

POST SCRIPT: The story of Genesis as told by Eddie Izzard

Much more interesting than the original. Makes more sense, too.

Collective good versus private profit

One of the clichés of academia that even non-academics know is “publish or perish.” In its most common understanding, it implies that those who publish more are perceived as productive scholars, worthy of recruitment and promotion.

But there are other reasons for publishing. One is to establish priority for one’s ideas. In academia, ideas are the currency that matter and those who have good ideas are seen as creative people. So people publish to ensure that they receive the appropriate credit.

Another reason for publishing is to put the ideas into public circulation so that others can use them and build on them to create even more knowledge. Knowledge thrives on the open exchange of information and the general principle in academia is that all knowledge should be open and freely available so that everyone can benefit from it.

This is not, of course, the case in the profit-driven private sector, where information is jealously guarded so that the maximum profit can be obtained. This is not unreasonable in many cases. After all, without being profitable, companies would go out of business and many of the innovations we take for granted would not occur. So the knowledge is either guarded jealously (like, say, the formula for Coca-Cola) or is patented so that other users have to pay for the privilege of using it.

But the open-information world of academia can collide with the closed, profit-making corporate world. Nowhere is this more apparent than in the drug industry. Much of the funding for medical and drug research comes from the government via agencies like the National Institutes of Health and is channeled through university and hospital researchers. These people then publish their results. But that knowledge is then often built on by private drug companies that manufacture drugs that are patented and sold for huge profits. These companies often use their immense legal resources to extend the effective lifetime of their patents so that they can profit even more.

Another example of a collision between the public good and private profit was the project to completely map the human genome. This government-funded project was designed to be open, with the results published and put into the public domain. Both heads of the Human Genome Project, first James Watson and then Francis Collins, strongly favored the open release of whatever was discovered, because of the immense potential benefits to the public. They created a giant public database into which researchers could insert their results, enabling others to use them. (To see what is involved in patenting genomic information, see here.)

But then Craig Venter, head of the private biotechnology company Celera Genomics, decided that his company would try to map the genome, make it proprietary information, and create a fee-based database. This was fiercely resisted by the scientific community, which accelerated its efforts to map the genome first and make the information open to all. The race was on, and the scientific community succeeded in its goal of making the information public. Information on how to access the public database can be found here.

Many non-academics, like the journalist writing about faculty cars, simply do not understand this powerful desire amongst academics for open access to information. I recall the discussion I had with my students regarding the film Jurassic Park. I hated the film for many reasons and said how bizarre it was that the discoverer of the process by which dinosaurs had been recreated from their DNA, a spectacular scientific achievement, had kept his knowledge secret in order to create a dinosaur theme park and make money. I said that this was highly implausible. A real scientist would have published his results to establish his claim as the original discoverer and made the information public so that others could build on it. But some of my students disagreed. They thought it perfectly appropriate that the scientist’s first thought was how to make a lot of money off his discovery rather than spread knowledge.

It is true that nowadays scientists and universities are increasingly seeking to file patents and create spin-off companies to benefit financially from their discoveries. Michael Moore talks about how things have changed and how the drive to make money is harming the collective good:

Thinking about that era, back in the first half of the 20th century, where you had for instance the man who invented the kidney-dialysis machine. He didn’t want the patent for it, he felt it belonged to everybody. Jonas Salk and the polio vaccine, again, he wouldn’t patent it. The famous quote for him is, “Would you patent the sun? It belongs to everyone.” He wasn’t doing this to become a millionaire. He was doing it because it was the right thing to do. During that era, that’s the way people thought.

It may be that I am living in the past, and that those students who thought I was crazy for rejecting money as the prime motivator for scientists and other academics have a better finger on the pulse than I do. Perhaps new knowledge is no longer seen so clearly as a public good, belonging to the world, to be used for the benefit of all. If so, it is a pity.

POST SCRIPT: Nelson Mandela, terrorist

Did you know that all this time, the US government considered Nelson Mandela to be a terrorist?

Precision in language

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

Some time ago, a commenter to this blog sent me a private email expressing this view:

Have you ever noticed people say “Do you believe in evolution?” just as you would ask “Do you believe in God?” as if both schools of thought have equal footing? I respect others’ religious beliefs as I realize I cannot disprove God just as anyone cannot prove His existence, but given the amount of evidence for evolution, shouldn’t we insist on asking “Do you accept evolution?”

It may just be semantics, but I feel that the latter wording carries an implied affirmation just as “Do you accept that 2+2=4?” carries a different meaning than “Do you believe 2+2=4?”

I guess the point I’m trying to make is that by stating something as a belief, it opens the debate to the possibility that something is untrue. While this may be fine for discussions of religion, shouldn’t the scientific community be more insistent that a theory well supported by physical evidence, such as evolution, is not up for debate?

It’s a good point. To be fair, scientists themselves are partly responsible for this confusion because we also say that we “believe” in this or that scientific theory, and one cannot blame the general public for picking up on that terminology. What is important to realize, though, is that the word ‘believe’ is being used by scientists in a different sense from the way it is used in religion.

The late and deeply lamented Douglas Adams, author of The Hitchhiker’s Guide to the Galaxy, who called himself a “radical atheist,” puts it nicely (thanks to onegoodmove):

First of all I do not believe-that-there-is-not-a-god. I don’t see what belief has got to do with it. I believe or don’t believe my four-year old daughter when she tells me that she didn’t make that mess on the floor. I believe in justice and fair play (though I don’t know exactly how we achieve them, other than by continually trying against all possible odds of success). I also believe that England should enter the European Monetary Union. I am not remotely enough of an economist to argue the issue vigorously with someone who is, but what little I do know, reinforced with a hefty dollop of gut feeling, strongly suggests to me that it’s the right course. I could very easily turn out to be wrong, and I know that. These seem to me to be legitimate uses for the word believe. As a carapace for the protection of irrational notions from legitimate questions, however, I think that the word has a lot of mischief to answer for. So, I do not believe-that-there-is-no-god. I am, however, convinced that there is no god, which is a totally different stance. . .

There is such a thing as the burden of proof, and in the case of god, as in the case of the composition of the moon, this has shifted radically. God used to be the best explanation we’d got, and we’ve now got vastly better ones. God is no longer an explanation of anything, but has instead become something that would itself need an insurmountable amount of explaining…

Well, in history, even though the understanding of events, of cause and effect, is a matter of interpretation, and even though interpretation is in many ways a matter of opinion, nevertheless those opinions and interpretations are honed to within an inch of their lives in the withering crossfire of argument and counterargument, and those that are still standing are then subjected to a whole new round of challenges of fact and logic from the next generation of historians – and so on. All opinions are not equal. Some are a very great deal more robust, sophisticated and well supported in logic and argument than others.

When someone says that they believe in god, they mean that they believe something in the absence of, or even counter to, the evidence, and sometimes even counter to reason and logic. When scientists say they believe a particular theory, they mean that they believe that theory because of the evidence and reason and logic, and the more evidence there is, and the better the reasoning behind it, the more strongly they believe it. Scientists use the word ‘belief’ the way Adams says, as a kind of synonym for ‘convinced,’ because we know that no scientific theory can be proven with 100% certainty and so we have to accept things even in the face of this remaining doubt. But the word ‘believe’ definitely does not carry the same meaning in the two contexts.

This can generate the kind of confusion the commenter warns about, but what can we do about it? One option is, as was suggested, to use different words, with scientists avoiding use of the word ‘believe.’ I would have agreed with this some years ago, but I am becoming increasingly doubtful that we can control the way that words are used.

For example, there was a time when I used to be on a crusade against the erroneous use of the word ‘unique’. The Oxford English Dictionary is pretty clear about what this word means:

  • Of which there is only one; one and no other; single, sole, solitary.
  • That is or forms the only one of its kind; having no like or equal; standing alone in comparison with others, freq. by reason of superior excellence; unequalled, unparalleled, unrivalled.
  • Formed or consisting of one or a single thing
  • A thing of which there is only one example, copy, or specimen; esp., in early use, a coin or medal of this class.
  • A thing, fact, or circumstance which by reason of exceptional or special qualities stands alone and is without equal or parallel in its kind.

It means, in short, one of a kind, so something is either unique or it is not. There are no in-betweens. And yet, you often find people saying things like “quite unique” or “very unique” or “almost unique.” I used to try and correct this but have given up. Clearly, people in general think that unique means something like “rare” and I don’t know that we can ever change this even if we all become annoying pedants, correcting people all the time, avoided at parties because of our pursuit of linguistic purity.

Some battles, such as the one over the word unique, are, I believe, lost for good, and I expect the OED to add the new meaning of ‘rare’ some time in the near future. It is a pity because we would then be left with no word with the unique meaning of ‘unique’, but there we are. We would have to say something like ‘absolutely unique’ to convey the meaning once reserved for just ‘unique.’

In science too we often use words with precise operational meanings, while the same words are used in everyday language with much looser meanings. For example, in physics the word ‘velocity’ is defined operationally by the following procedure: an object moves along a ruler and, at two points along its motion, you take ruler readings and clock readings, where the clocks are located at the points where the ruler readings are taken and have been previously synchronized. The velocity of the moving object is then the difference between the two ruler readings divided by the difference between the two clock readings.
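As a minimal sketch of that operational definition (the symbols $x_1, x_2$ for the two ruler readings and $t_1, t_2$ for the corresponding synchronized clock readings are my own notation, not anything from the original post), the velocity is

$$v = \frac{x_2 - x_1}{t_2 - t_1}.$$

Speed is just the magnitude of this quantity, and acceleration is the analogous ratio formed from velocity differences rather than ruler-reading differences, which is why the three words are not interchangeable.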

Most people (especially sports commentators) have no idea of this precise meaning when they use the word velocity in everyday language, and often use the word synonymously with speed or, even worse, acceleration, although those concepts have different operational meanings. Even students who have taken physics courses find it hard to use the word in its strict operational sense.

Take, for another example, the word ‘theory’. By now, as a result of the intelligent design creationism (IDC) controversy, everyone should be aware that the way this word is used by scientists is quite different from its everyday use. In science, a theory is a powerful explanatory construct. Science depends crucially on its theories because they are the things that give it its predictive power. “There is nothing so practical as a good theory,” as Kurt Lewin famously said. But in everyday language, the word theory is used to mean ‘not factual,’ something that can be false or ignored.

I don’t think that we can solve this problem by putting constraints on how words can be used. English is a wonderful language precisely because it grows and evolves and trying to fix the meanings of words too rigidly would perhaps be stultifying. I now think that we need to change our tactics.

I think that once words enter mainstream consciousness, we will not be successful in trying to restrict their meanings beyond their generally accepted usage. What we can do is to make people aware that all words have varying meanings depending on the context, and that scientific and other academic contexts tend to require very precise meanings in order to minimize ambiguity.

Heidi Cool has a nice entry where she talks about the importance of being aware of when you are using specialized vocabulary, and the need to know your audience when speaking or writing, so that some of the pitfalls arising from the imprecise use of words can be avoided.

We have to realize though that despite our best efforts, we can never be sure that the meaning that we intend to convey by our words is the same as the meaning constructed in the minds of the reader or listener. Words always contain an inherent ambiguity that allows the ideas expressed by them to be interpreted differently.

I used to be surprised when people read the stuff I wrote and got a different meaning than I had intended. No longer. I now realize that there is always some residual ambiguity in words that cannot be overcome. While we can and should strive for maximum precision, we can never be totally unambiguous.

I agree with philosopher Karl Popper when he said, “It is impossible to speak in such a way that you cannot be misunderstood.” The best we can hope for is to have some sort of negotiated consensus on the meanings of ideas.

POST SCRIPT: Huckabee and Paul

Alexander Cockburn discusses why Mike Huckabee and Ron Paul are the two most interesting candidates on the Republican side.

Reflections on writing the posts on evolution and the law

When I started out to write the series of posts on evolution and the law, I originally intended it to be about ten posts in all, divided roughly equally between the Scopes trial, the Dover trial, and the period of legal evolution in between them. As those readers who have stayed with the series are painfully aware, the subject matter carried me away and the final result is much longer.

Part of the reason is that I always intend my blog posts to have some useful and reliable information and not just be speculative rants (though those can be fun), which meant that I needed to research the subject. Fortunately, I love the subject of constitutional law because my interest in it is a spin-off of my interest in how one creates a just society. If one traces people’s constitutional protections to their source, they tend to be rooted in questions about power and control, about the nature of liberty, about who gets to make the decisions that govern all of us, and about what constraints we impose on them.

As I started to research the subject more deeply, I became fascinated by the interplay of political, social, and religious factors surrounding the question of what the role of public schools in a democratic society is and how we decide what should be taught in them. I could see that the legal history involved in the teaching of evolution in public schools was more complicated and fascinating than I had originally conceived.

I had two choices. I could close off some avenues of discussion and stick only to the main points. That would be like driving to some destination while sticking to the highway for maximum speed. Or I could take some detours off the beaten track, to get a better flavor of the country I was passing through. I felt that the former option, while making for quicker reading, would result in posts that were a little too glib and did not have enough supporting evidence for some of my assertions.

So I chose the latter option, feeling confident that those who read this blog tend to be those who are looking for at least some substantiation of arguments even if they disagree with my views.

The way these posts grew made me reflect on my philosophy of teaching as well. In my seminar courses, students have to write research papers on some topic. Usually a course requires two five-page papers and a final ten-page paper. Students have been through this drill of writing papers many times in many courses and they usually find that they do not have enough to say and struggle to fill what they see as a quota. They use some time-tested techniques: wide margins, large fonts and spacing, and when those things have reached their limit, unnecessary verbiage. Superfluous words and phrases are inserted, ideas are repeated, pointless examples and non sequitur arguments are brought in, and so forth.

The reason for this is that in most cases students are writing about things that they do not really care about and are just going through the motions to meet someone else’s needs, not their own. The result is painful for both the student (who has to construct all this padding without it being too obvious that that is what it is) and for the instructor (who has to cut through all the clutter to find out what the author is really trying to say). It is largely a waste of time for both, and often unpleasant to boot.

To help overcome this problem, I give my students as much freedom as possible to choose a research topic within the constraints of the overall course subject matter. I tell students that the most important thing they will do in the course is choose a topic that they care passionately about and want to learn more about. Once they do that, and start investigating and researching such a subject, it is almost inevitable that they will get drawn in deeper and deeper, like I was with evolution and the law.

Once they are on that road, the problem is not how to fill the required number of pages but how to cut the paper down so that they don’t exceed the page limits by too much. This has the added bonus of teaching students how to edit to tighten their prose, to use more judicious language, and to keep only those things that are essential to making their case.

The passion for the subject and the desire to know more about it are what make genuine researchers carry out difficult and sometimes tedious tasks, because they really care about learning more.

The way this series of posts has grown is an example of this phenomenon at work. Because it is a blog without length restrictions, I have been able to indulge myself a bit. But if I had to restrict the length because of publication needs, then I would go back and do some serious pruning.

POST SCRIPT: The bullet trick

Penn and Teller do another of their famous tricks.