How old are you?

In an article in the New York Times, Nicholas Wade points out that our bodies are younger than we think, because there is a discrepancy between our birth age and the age of the cells that make up our bodies.

Whatever your age, your body is many years younger. In fact, even if you’re middle aged, most of you may be just 10 years old or less.

This heartening truth, which arises from the fact that most of the body’s tissues are under constant renewal, has been underlined by a novel method of estimating the age of human cells. Its inventor, Jonas Frisén, believes the average age of all the cells in an adult’s body may turn out to be as young as 7 to 10 years.

He quotes the work of Spalding, Bhardwaj, Buchholz, Druid, and Frisén of the Karolinska Institute, whose paper in the July 15, 2005 issue of Cell uses the radioactive isotope carbon-14 to determine the age of the cells in our bodies. The carbon that forms organic matter is largely obtained from the atmosphere. Plants, for example, take in carbon dioxide from the air and exude oxygen as part of the process of photosynthesis. Hence the proportion of carbon-14 found in living organic matter is the same as that in the ambient atmosphere at the time it was absorbed. The level of carbon-14 in the atmosphere is fairly constant because its rate of production is balanced by its rate of decay. Once a plant dies, it takes in no new carbon, so the decay of the carbon-14 it had at the moment of death leaves a steadily smaller proportion of the isotope, and that difference can be used to measure how long it has been dead. The half-life of carbon-14 is 5,730 years, so this method can determine the age of dead organic matter up to about 50,000 years, a range that conveniently covers most archeological dating.
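The decay arithmetic behind this is simple to sketch. The little function below is my own illustration of the exponential decay law, not anything from the paper; the function name and the sample fractions are made up for the example:

```python
import math

HALF_LIFE_C14 = 5730.0  # years

def age_from_c14_fraction(fraction_remaining):
    """Estimate years since death from the fraction of the original
    carbon-14 still present, using the exponential decay law."""
    if not 0.0 < fraction_remaining <= 1.0:
        raise ValueError("fraction must be in (0, 1]")
    return -HALF_LIFE_C14 * math.log(fraction_remaining) / math.log(2)

# A sample with half its original carbon-14 died one half-life ago:
print(round(age_from_c14_fraction(0.5)))    # 5730
# A sample with only ~0.3% remaining is near the ~50,000-year
# practical limit of the method:
print(round(age_from_c14_fraction(0.003)))
```

The 50,000-year ceiling mentioned above falls out of this directly: after eight or nine half-lives so little carbon-14 remains that it can no longer be measured reliably.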

The way that Frisén and his co-workers used this knowledge to measure the age of cells in humans is quite clever. Carbon-14 is produced by cosmic rays, so the level of carbon-14 in the atmosphere should be constant. This is why we can tell how long something has been dead but not when it was ‘born’, i.e., when the organic matter was created. But in the 1950s and 1960s, there was a sharp spike in carbon-14 levels because of the atmospheric testing of nuclear weapons. Once atmospheric test ban treaties came into being, the surge of carbon-14 that had been produced became diffused in the atmosphere as it spread over the globe, and so there has been a steady decline in average carbon-14 levels over time. It is this decline that enables us to know when the carbon-14 was absorbed to create organic matter.
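The bomb-pulse method can be sketched the same way: because atmospheric carbon-14 has declined steadily since the test bans, a measured level corresponds to roughly one calendar year. The table and function below are purely my own illustration; the numbers are invented, since real studies rely on calibrated atmospheric records:

```python
# Sketch of the bomb-pulse idea: on the declining side of the pulse,
# a measured carbon-14 level in DNA maps to roughly one calendar year.
# These values are invented for illustration, NOT real calibration data.
ATMOSPHERIC_C14 = [   # (year, level relative to the pre-bomb baseline)
    (1965, 1.80),
    (1975, 1.35),
    (1985, 1.20),
    (1995, 1.10),
    (2005, 1.06),
]

def birth_year_from_c14(level):
    """Linearly interpolate the year in which DNA with this relative
    carbon-14 level was created."""
    for (y0, c0), (y1, c1) in zip(ATMOSPHERIC_C14, ATMOSPHERIC_C14[1:]):
        if c1 <= level <= c0:
            return y0 + (y1 - y0) * (c0 - level) / (c0 - c1)
    raise ValueError("level outside the tabulated range")

# DNA made when levels were halfway between the 1965 and 1975 values
# would date to about 1970:
print(birth_year_from_c14(1.575))   # 1970.0
```

The real measurement works in the same spirit: compare the carbon-14 in a cell’s genomic DNA against the recorded atmospheric curve and read off the year the DNA was synthesized.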


The amount of carbon-14 in the genomic DNA can thus be used to measure when the DNA in a cell was created. The technique was checked against the age of trees, which can be measured by the amounts of carbon-14 found in the various rings as the isotope is absorbed during photosynthesis. Their results and those of others show that different parts of the body get replaced after different durations, whose approximate values are given below. (I have included results from both the Wade newspaper article and the Frisén paper.)

Stomach lining: five days
Surface layer of skin: two weeks
Red blood cells: three months
Liver: one year
Skeleton: 10 years
Intestine: 11 years
Rib muscles: 15 years

This explains why our bodies seem so durable and able to withstand considerable abuse. [UPDATE: Later studies find that about 50% of our heart muscle cells are replaced over our lifetime, but the brain cells seem to be largely unchanged.]

So why do we die if parts of us keep getting regenerated? It seems as if the ability of stem cells to keep reproducing declines with age. In other words there seems to be a limit to the number of times that cells can reproduce and once we reach that limit, the ability of the body to regenerate itself ceases. What causes this limit is still an open question. As Wade writes:

Some experts believe the root cause is that the DNA accumulates mutations and its information is gradually degraded. Others blame the DNA of the mitochondria, which lack the repair mechanisms available for the chromosomes. A third theory is that the stem cells that are the source of new cells in each tissue eventually grow feeble with age.

Frisén thinks his research might be able to shed some light on this question, especially the third option, saying, “The notion that stem cells themselves age and become less capable of generating progeny is gaining increasing support.”

Who should own the rights to one’s tissues?

People generally do not think about what happens to the blood and tissue samples they give as part of medical tests, assuming that they are eventually discarded in some way. Many are not aware that their samples may be retained for research or even commercial purposes. Once you give a sample away, you lose all rights to what is subsequently done with it, even if your body parts have some unique property that can be used to make drugs and other products that can be marketed commercially.

The most famous case of this is Henrietta Lacks, a poor black woman in Baltimore who died from cervical cancer in 1951. A researcher who had been trying unsuccessfully, like others, to have cells reproduce in the test tube, received a sample of hers too. It turned out that her cancer cells, unlike other cells, could reproduce endlessly in test tubes, providing a rich and inexhaustible source of cells for research and treatment. Her cells, called HeLa, have taken on a life of their own and have travelled the world long after she died. Her story is recounted in the book The Immortal Life of Henrietta Lacks by Rebecca Skloot.

The issue of whether one’s cells should be used without one’s permission and whether one should be able to retain the rights to one’s tissues is a tricky one for law and ethics.

“Science is not the highest value in society,” [Lori Andrews, director of the Institute for Science, Law, and Technology at the Illinois Institute of Technology] says, pointing instead to things like autonomy and personal freedom. “Think about it,” she says. “I decide who gets my money after I die. It wouldn’t harm me if I died and you gave all my money to someone else. But there is something psychologically beneficial to me as a living person to know I can give my money to whoever I want. No one can say, ‘She shouldn’t be allowed to do that with her money because that might not be most beneficial to society.’ But replace the word money in that sentence with tissue, and you’ve got precisely the logic many people use against giving donors control over their tissues.” (Skloot, p. 321)

It does seem wrong somehow for private companies to profit hugely from the lives and bodies of others without owing them anything. In the case of Henrietta Lacks, her family remained very poor and lacked health insurance and proper medical care even while her cells became famous, something they bitterly resented. They did not even know about the widespread use of her cells until two decades later.

On the other hand, it would put a real crimp on research if scientists had to keep track of whose tissues they were working on. Since we all benefit (or should benefit) from the results of scientific research, one can make the case that the tissues we give up are like the trash we throw away, things for which we have voluntarily given away our rights. If the tissues are used for medical research done by public institutions like the NIH or universities and the results are used not for profit but to benefit the general public, this would, I believe, remove many of the objections to the uncredited use of tissues.

You can see why scientists would prefer to have the free use of tissues but what I don’t understand are those scientists who go overboard in making special exceptions for religion.

David Korn, vice president for research at Harvard University, says: “I think people are morally obligated to allow their bits and pieces to be used to advance knowledge to help others. Since everybody benefits, everybody can accept the small risks of having their tissue scraps used in research.” The only exception he would make is for people whose religious beliefs prohibit tissue donation. “If somebody says being buried without all their pieces will condemn them to wandering forever because they can’t get salvation, that’s legitimate, and people should respect it,” Korn says. (Skloot, p. 321)

This is another case where religions try to claim special privileges denied to everyone else. Why is that particular claim legitimate? Why should religious superstitions get priority over other irrational beliefs? Our bodies are in a constant state of flux. They shed cells all the time in the normal course of our daily lives, which is why DNA testing has become such a valuable forensic tool for solving crimes. Since we are losing old cells and gaining new ones all the time, it is a safe bet that hardly any of the cells that were part of me as a child are still in my body. So the whole idea that the afterlife consists of ‘all of me’ is absurd, since that would require bringing together all the cells that I have shed during my life, resulting in my having multiple organs and limbs, like some horror fiction monster.

Rather than pandering to this fantasy, we should educate people that our bodies are in a constant state of flux, that our seemingly permanent bodies are actually transient entities.

Atheism is a byproduct of science

Science is an atheistic enterprise. As the eminent population geneticist J. B. S. Haldane said:

My practice as a scientist is atheistic. That is to say, when I set up an experiment I assume that no god, angel or devil is going to interfere with its course; and this assumption has been justified by such success as I have achieved in my professional career. I should therefore be intellectually dishonest if I were not also atheistic in the affairs of the world.

While not every scientist would apply the highly successful atheistic methodology to every aspect of their lives as Haldane does, the fact that intellectual consistency requires it, coupled with the success of science, has persuaded most scientists that leaving god out of things is a good way to proceed and hence it should not be surprising that increasing awareness of science correlates with increased levels of atheism.

But it would be wrong to conclude that scientists have atheism as a driving concern in their work or that they actively seek out theories that deny the existence of god. God is simply irrelevant to their work. The negative implications of scientific theories for god are a byproduct of scientific research rather than its principal aim. Non-scientists may be surprised that discussions about god are almost nonexistent at scientific meetings and even in ordinary interactions among scientists. We simply take it for granted that god plays no role whatsoever.

For example, the idea of the multiverse has torpedoed the argument of religious people that the universe must have had a beginning or that its parameters seem to be fine-tuned for human life, which they argue are evidence for god. They suspect that the multiverse idea was created simply to eliminate god from two of the last three refuges in which he could be hiding. (The third refuge is the origin of a self-replicating molecule that was the precursor of life.) In his article titled Does the Universe Need God?, cosmologist Sean Carroll dismisses that idea.

The multiverse is not a theory; it is a prediction of a theory, namely the combination of inflationary cosmology and a landscape of vacuum states. Both of these ideas came about for other reasons, having nothing to do with the multiverse. If they are right, they predict the existence of a multiverse in a wide variety of circumstances. It’s our job to take the predictions of our theories seriously, not to discount them because we end up with an uncomfortably large number of universes.

Carroll ends with a nice summary of what science is about and why god really has no reason to be postulated into existence. This is similar to the points I made in my series on why atheism is winning.

Over the past five hundred years, the progress of science has worked to strip away God’s roles in the world. He isn’t needed to keep things moving, or to develop the complexity of living creatures, or to account for the existence of the universe. Perhaps the greatest triumph of the scientific revolution has been in the realm of methodology. Control groups, double-blind experiments, an insistence on precise and testable predictions – a suite of techniques constructed to guard against the very human tendency to see things that aren’t there. There is no control group for the universe, but in our attempts to explain it we should aim for a similar level of rigor. If and when cosmologists develop a successful scientific understanding of the origin of the universe, we will be left with a picture in which there is no place for God to act – if he does (e.g., through subtle influences on quantum-mechanical transitions or the progress of evolution), it is only in ways that are unnecessary and imperceptible. We can’t be sure that a fully naturalist understanding of cosmology is forthcoming, but at the same time there is no reason to doubt it. Two thousand years ago, it was perfectly reasonable to invoke God as an explanation for natural phenomena; now, we can do much better.

None of this amounts to a “proof” that God doesn’t exist, of course. Such a proof is not forthcoming; science isn’t in the business of proving things. Rather, science judges the merits of competing models in terms of their simplicity, clarity, comprehensiveness, and fit to the data. Unsuccessful theories are never disproven, as we can always concoct elaborate schemes to save the phenomena; they just fade away as better theories gain acceptance. Attempting to explain the natural world by appealing to God is, by scientific standards, not a very successful theory. The fact that we humans have been able to understand so much about how the natural world works, in our incredibly limited region of space over a remarkably short period of time, is a triumph of the human spirit, one in which we can all be justifiably proud.

Religious believers misuse this fundamental nature of scientific inquiry, that all conclusions are tentative and that what we believe to be true is a collective judgment made by comparing theories and determining which one is best supported by evidence, to make the misleading case that unless we have proved one single theory to be true, other theories (especially the god theory) should merit serious consideration. This is wrong. While we may not be able to prove which theories are right and which are wrong, we do know how to judge which ones are good and which ones are bad.

God is a terrible theory. It fails utterly to deliver the goods, and so should be abandoned like all the other failed theories of the past. In the film Love and Death, Woody Allen’s character says, “If it turns out that there is a god, I don’t think that he’s evil. I think that the worst you can say about him is that basically he’s an underachiever.” He is right.

God is not the ‘simplest’ explanation for the universe

Believers in god (especially of the intelligent design variety) like to argue that a god is a ‘simpler’ explanation than any of the alternatives for many natural phenomena. But they seem to equate simple with naïve, in the sense that what makes something simple is that it should be understandable by a child. For example, if a child asks you why the sun rises and sets every day, giving an explanation in terms of the laws of gravity, Newton’s laws of motion, and the Earth’s rotation about its own axis is not ‘simple’. A child would more likely understand an explanation in which there is a man whose job it is to push the sun around in its daily orbit. This is ‘simpler’ because the concepts of ‘man’ and ‘push’ are familiar ones to a child, requiring no further explication. But this apparent simplicity is an illusion because it ignores enormously complicating factors such as how the man got up there, how strong he must be, why we don’t see him, and so on. It is because such issues are swept under the rug that this explanation appears to be simple.

In his article titled Does the Universe Need God?, cosmologist Sean Carroll points out that introducing a new ad hoc element like god into a theory actually makes things enormously complicated. The erroneous idea that simplicity is linked to the number of entities involved is based on a misconception of science.

All else being equal, a simpler scientific theory is preferred over a more complicated one. But how do we judge simplicity? It certainly doesn’t mean “the sets involved in the mathematical description of the theory contain the smallest possible number of elements.” In the Newtonian clockwork universe, every cubic centimeter contains an infinite number of points, and space contains an infinite number of cubic centimeters, all of which persist for an infinite number of separate moments each second, over an infinite number of seconds. Nobody ever claimed that all these infinities were a strike against the theory.

The simplicity of a theory is a statement about how compactly we can describe the formal structure (the Kolmogorov complexity), not how many elements it contains. The set of real numbers consisting of “eleven, and thirteen times the square root of two, and pi to the twenty-eighth power, and all prime numbers between 4,982 and 34,950” is a more complicated set than “the integers,” even though the latter set contains an infinitely larger number of elements. The physics of a universe containing 10^88 particles that all belong to just a handful of types, each particle behaving precisely according to the characteristics of its type, is much simpler than that of a universe containing only a thousand particles, each behaving completely differently.

At first glance, the God hypothesis seems simple and precise – an omnipotent, omniscient, and omnibenevolent being. (There are other definitions, but they are usually comparably terse.) The apparent simplicity is somewhat misleading, however. In comparison to a purely naturalistic model, we’re not simply adding a new element to an existing ontology (like a new field or particle), or even replacing one ontology with a more effective one at a similar level of complexity (like general relativity replacing Newtonian spacetime, or quantum mechanics replacing classical mechanics). We’re adding an entirely new metaphysical category, whose relation to the observable world is unclear. This doesn’t automatically disqualify God from consideration as a scientific theory, but it implies that, all else being equal, a purely naturalistic model will be preferred on the grounds of simplicity.
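Carroll’s point that simplicity means compactness of description rather than fewness of elements can be made concrete. In this toy sketch of mine, the character count of a generating rule stands in, very crudely, for Kolmogorov complexity; the particular sets and the counting measure are my own illustration, not Carroll’s:

```python
# A set's "simplicity" tracks the length of the rule that generates it,
# not the number of elements it contains.
even_rule = "n % 2 == 0"   # ten characters describe infinitely many numbers

# A handful of unrelated numbers has no shorter description than the
# literal listing itself:
arbitrary_set = {11, 13 * 2 ** 0.5, 3.14159 ** 28, 4987, 5003}

print(len(even_rule))            # 10
print(len(repr(arbitrary_set)))  # much longer, and grows with each element
```

The rule for “all even integers” stays the same length no matter how many members it covers, while the arbitrary set’s description grows with every element added, which is exactly the sense in which an infinite but lawful universe is simpler than a small but lawless one.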

Religious people think that god is a ‘simpler’ theory because they give themselves the license to assign their god any property they wish in order to ‘solve’ any problem they encounter, without making the answer given in one area consistent with an answer given elsewhere. But the very fact that the god model is so malleable is what makes it so useless. For example, religious people will argue (as they must) that the way that the world currently exists, despite the suffering, disasters, and catastrophes that seem to afflict everyone indiscriminately, is evidence for a loving god. A colleague of mine who is a very thoughtful and sophisticated person told me recently that when he looks at the world, he sees one that is consistent with the existence of god.

This raises two questions. The first is whether the world that he sees is also consistent with the non-existence of god. If yes, how does he decide which option to believe? If no, what exactly is the source of the inconsistency?

The second question is what the world would need to look like for him to conclude that there is no god. Carroll gives a thought experiment that illustrates the shallowness of those who argue that the evils and misfortunes and calamities that bestride this world are actually evidence for god.

In numerous ways, the world around us is more like what we would expect from a dysteleological set of uncaring laws of nature than from a higher power with an interest in our welfare. As another thought experiment, imagine a hypothetical world in which there was no evil, people were invariably kind, fewer natural disasters occurred, and virtue was always rewarded. Would inhabitants of that world consider these features to be evidence against the existence of God? If not, why don’t we consider the contrary conditions to be such evidence?

It is not hard to understand why the concept of god could only have arisen in primitive, or at least pre-modern, times.

Consider a hypothetical world in which science had developed to something like its current state of progress, but nobody had yet thought of God. It seems unlikely that an imaginative thinker in this world, upon proposing God as a solution to various cosmological puzzles, would be met with enthusiasm. All else being equal, science prefers its theories to be precise, predictive, and minimal – requiring the smallest possible amount of theoretical overhead. The God hypothesis is none of these. Indeed, in our actual world, God is essentially never invoked in scientific discussions. You can scour the tables of contents in major physics journals, or titles of seminars and colloquia in physics departments and conferences, looking in vain for any mention of possible supernatural intervention into the workings of the world.

The concept of god is a relic of our ancient history, like the vestigial elements of animal physiology such as the leg bones of some snakes, the small wings of flightless birds like the kiwi, the eyes of the blind mole rat, and the tailbone, ear muscles, and appendix of humans. It will, like them, eventually disappear for the same reason: it has ceased to be of use.

The failure of fine-tuning arguments for god

When I ask people why they believe in god, their response almost invariably comes down to them being impressed with the complexity of the world and thinking that it could not have come about without some intelligent agent behind it. It is highly likely that this ‘reason’ is not the actual cause of their belief but a later rationalization for beliefs that they unthinkingly adopted as part of their childhood indoctrination into religion. When people become adults, they realize that saying they believe something because they were told it as children is likely to expose them to ridicule, and so they manufacture a superficially more rational answer.
[Read more…]

Why a god is not necessary to create the universe

In an article titled Does the Universe Need God?, cosmologist Sean Carroll provides a rejoinder to those who would try to squeeze god in as an answer to what they perceive as unexplained gaps in our knowledge. It is a long article that is worth reading in full but for those who lack the time, I will excerpt some of the key points.

He starts by making the same point that I made in the series Why atheism is winning, that the long-term outlook for religion is extremely bleak because science and its associated modernistic outlook are making it irrelevant in ways that are hard to ignore even by the most determined religionist.

Most modern cosmologists are convinced that conventional scientific progress will ultimately result in a self-contained understanding of the origin and evolution of the universe, without the need to invoke God or any other supernatural involvement. This conviction necessarily falls short of a proof, but it is backed up by good reasons. While we don’t have the final answers, I will attempt to explain the rationale behind the belief that science will ultimately understand the universe without involving God in any way.

Those who want to insert god somewhere, to show that he/she/it is necessary in some way, need to realize that they have at most a window of one second just after the Big Bang to work with.

While we don’t claim to understand the absolute beginning of the universe, by the time one second has elapsed we enter the realm of empirical testability. That’s the era of primordial nucleosynthesis, when protons and neutrons were being converted into helium and other light elements. The theory of nucleosynthesis makes precise predictions for the relative abundance of these elements, which have passed observational muster with flying colors, providing impressive evidence in favor of the Big Bang model. Another important test comes from the cosmic microwave background (CMB), the relic radiation left over from the moment the primordial plasma cooled off and became transparent, about 380,000 years after the Big Bang. Together, observations of primordial element abundances and the CMB provide not only evidence in favor of the basic cosmological picture, but stringent constraints on the parameters describing the composition of our universe.

He then clarifies what it means to talk about the Big Bang event, a singular event in time, as distinct from the Big Bang model that is the working out of the aftermath of that event.

One sometimes hears the claim that the Big Bang was the beginning of both time and space; that to ask about spacetime “before the Big Bang” is like asking about land “north of the North Pole.” This may turn out to be true, but it is not an established understanding. The singularity at the Big Bang doesn’t indicate a beginning to the universe, only an end to our theoretical comprehension. It may be that this moment does indeed correspond to a beginning, and a complete theory of quantum gravity will eventually explain how the universe started at approximately this time. But it is equally plausible that what we think of as the Big Bang is merely a phase in the history of the universe, which stretches long before that time – perhaps infinitely far in the past. [My italics] The present state of the art is simply insufficient to decide between these alternatives; to do so, we will need to formulate and test a working theory of quantum gravity.

The problem with “creation from nothing” is that it conjures an image of a pre-existing “nothingness” out of which the universe spontaneously appeared – not at all what is actually involved in this idea. Partly this is because, as human beings embedded in a universe with an arrow of time, we can’t help but try to explain events in terms of earlier events, even when the event we are trying to explain is explicitly stated to be the earliest one. It would be more accurate to characterize these models by saying “there was a time such that there was no earlier time.”

To make sense of this, it is helpful to think of the present state of the universe and work backwards, rather than succumbing to the temptation to place our imaginations “before” the universe came into being. The beginning cosmologies posit that our mental journey backwards in time will ultimately reach a point past which the concept of “time” is no longer applicable. Alternatively, imagine a universe that collapsed into a Big Crunch, so that there was a future end point to time. We aren’t tempted to say that such a universe “transformed into nothing”; it simply has a final moment of its existence. What actually happens at such a boundary point depends, of course, on the correct quantum theory of gravity.

The important point is that we can easily imagine self-contained descriptions of the universe that have an earliest moment of time. There is no logical or metaphysical obstacle to completing the conventional temporal history of the universe by including an atemporal boundary condition at the beginning. Together with the successful post-Big-Bang cosmological model already in our possession, that would constitute a consistent and self-contained description of the history of the universe.

Nothing in the fact that there is a first moment of time, in other words, necessitates that an external something is required to bring the universe about at that moment. [My italics]

The Big Bang event itself does not necessarily imply that the universe had a beginning in time, and even if it should turn out that it did, that does not imply a beginner. This strikes at the heart of the arguments of religious apologists, who claim that a beginning necessarily implies a beginner. That argument is weak to begin with, but it is the main one they have for god.

Religious people know that this conclusion is a devastating one for them. After all, if no god is required to create the universe, then he is truly an unnecessary concept. So they will fight or ignore or obfuscate this point with theological jargon.

Update on free will

Readers may recall my multi-part series on free will in which, among other things, I reported on the pioneering 1983 experiments of Benjamin Libet. Peter Hankins reviews a recent paper that uses sophisticated new technology that can look at the activity of individual neurons in the brain. The researchers get results that essentially validate Libet’s conclusions and provide further insights. Hankins explains what it might all mean.

Inattentional deafness

I have long been intrigued by the fact that when I am absorbed in reading, I completely miss what people have said, even if they have been speaking directly to me. This can be embarrassing but in my case people tend to indulgently excuse it because of the stereotype of the ‘absent minded professor’. Being a theoretical physicist also helps since we are considered to be a little weird anyway.
[Read more…]

The McGurk effect

Blog reader Henry sent me the link to this clip from the BBC program Horizon on what is known as the McGurk effect, which shows that when the brain receives two conflicting inputs, one aural and one visual, it forces you to register just one. Lawrence Rosenblum of the University of California, Riverside explains this effect and demonstrates how in this particular case the visual overrides the sound.

If we cannot do such a simple act of multitasking, imagine how unlikely it is that we can do more complex and challenging multitasking.

The motives of the Templeton Foundation

The June 21, 2010 issue of The Nation has a good article by Nathan Schneider titled God, Science and Philanthropy that looks at the work of this wealthy foundation, which dangles generous grants and an annual cash prize larger than the Nobel Prize that goes, as Richard Dawkins says, “usually to a scientist who is prepared to say something nice about religion.”

Along with providing support for politically right-wing organizations, the foundation’s goal seems to be to lure scientists into signing on to the idea that science and religion are compatible. Nobel Prize-winning chemist Harold Kroto is one of those fighting back against it and says of the foundation that “They are involved in an exercise that endangers the fundamental credibility of the scientific community.”