One of the websites I enjoy visiting from time to time is TimeTree. It tells you how long ago any two branches of our evolutionary tree had a common ancestor. It gives the estimates from different published studies, with citations, along with an average value.
So for example, if in the two boxes marked Taxon A and Taxon B on the top left I put in (say) Homo sapiens and dog, I get 94.4 million years ago as the average time, with a range from 57 to 116 million years. So that is when Baxter the Wonder Dog and I last had a common ancestor. Humans and frogs had a common ancestor 371 million years ago, while for humans and mushrooms we have to go way back, to 1.218 billion years ago.
There is something remarkably bracing about knowing that I am biologically related, in a very tangible way, to every single organism that lives now and has ever lived. Being able to actually calculate when our paths diverged makes it feel even more real.
How do they calculate these times? One of the methods involves a ‘molecular clock’ based on the rate at which DNA changes. Once we know that rate, we can compare the DNA of two species and calculate how much time has elapsed since they diverged. But not every part of the DNA can be used to calculate the rate, because different parts change at different rates. As Matthew Cobb explains:
The basic assumption behind the molecular clock is that mutations – changes in DNA – occur at a constant rate over time, and that the number of differences between two groups can therefore be turned into a figure based on the time since the two diverged.
There are some important provisos to the clock – any stretch of DNA that is subject to selection, for example, is not going to be a very useful source of clock data, as genetic differences will tend to be removed by selection; many genes that are vital to organismal function are therefore highly conserved, showing few differences between groups. For this reason, scientists tend to use either ‘synonymous changes’ in DNA – these are ‘silent’ differences that do not cause any change in gene function (protein structure, gene regulation, or whatever) – or to use stretches of non-coding DNA, which appear to be not subject to natural selection and to evolve ‘neutrally’, just accumulating mutations with time.
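The clock logic described above reduces to simple arithmetic: count the differences between two neutrally evolving sequences and divide by the rate at which such differences accumulate. Here is a minimal sketch in Python; the rate and the toy sequences are illustrative values of my own, not data from the article, and real analyses use far longer alignments and statistical corrections:

```python
def divergence_time_years(seq_a, seq_b, rate_per_site_per_year):
    """Estimate time since two lineages split using a molecular clock.

    After the split, mutations accumulate independently on BOTH
    lineages, so the per-site difference is roughly 2 * rate * time,
    giving time = (fraction of differing sites) / (2 * rate).
    """
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    diffs = sum(a != b for a, b in zip(seq_a, seq_b))
    per_site = diffs / len(seq_a)
    return per_site / (2 * rate_per_site_per_year)

# Illustrative neutral rate: ~4e-10 substitutions per site per year.
rate = 4e-10

# Toy 20-site alignment with 2 differing sites, i.e. 10% divergence.
a = "ACGTACGTACGTACGTACGT"
b = "ACGAACGTACGTACGTTCGT"

# 0.10 / (2 * 4e-10) = 125 million years.
print(divergence_time_years(a, b, rate))
```

The factor of 2 is the step people often forget: the observed differences are split across two diverging lineages, not accumulated along one.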
Now a new study finds that the rate of change that we used previously may have been a little too fast, at least during the period of human evolution. As Ann Gibbons explains in the 12 October 2012 issue of the journal Science:
Now it seems that the molecular clock ticks more slowly than anyone had thought, and many dates may need to be adjusted. Over the past 3 years, researchers have used new methods to sequence whole human genomes, allowing them to measure directly, for the first time, the average rate at which new mutations arise in a newborn baby. Most of these studies conclude that the mutation rate in humans today is roughly half the rate that has been used in many evolutionary studies since 2000.
Remarkably, all the studies got about the same rate: 1.2 × 10⁻⁸ mutations per generation at any given nucleotide site. That’s about 1 in 2.4 billion mutations per site per year (assuming an average generation time of 29 years)—and that’s less than half of the old, fossil-calibrated rate.
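The “1 in 2.4 billion” figure follows directly from the two numbers Gibbons quotes, and the conversion is worth checking: dividing the per-generation rate by the generation time gives the per-year rate.

```python
mut_per_gen = 1.2e-8  # mutations per site per generation (new studies)
gen_years = 29        # assumed average human generation time

per_year = mut_per_gen / gen_years  # ~4.14e-10 mutations per site per year
one_in = 1 / per_year               # ~2.4e9, i.e. 1 in 2.4 billion per year

print(per_year)
print(one_in / 1e9)  # in billions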
A slower-ticking molecular clock also has major implications for evolution. For example, the slow clock suggests that the ancestors of modern humans and Neandertals diverged about 400,000 to 600,000 years ago, rather than 272,000 to 435,000 years ago. This fits nicely with fossils of Homo heidelbergensis, which date between 350,000 and 600,000 years ago and are thought to be ancestral to Neandertals. Scally and Durbin also revised the dates for modern human evolution, such as pushing back the timing of a dramatic population bottleneck from 100,000 to 120,000 (rather than 60,000 to 80,000) years ago, and the emergence of modern humans out of Africa to 90,000 to 130,000 (rather than less than 70,000) years ago.
The problem is that assuming a constant rate of change for all species at all times creates conflicts with elements of the fossil record and generates some results that are hard to corroborate, such as that humans and orangutans split about 40 million years ago, when other evidence puts the figure at around 9 to 13.9 million years ago. (TimeTree puts the range between 8.5 and 22 million years ago with a mean of 15.7 million years.)
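This sensitivity to the assumed rate is easy to see from the clock formula itself: for a fixed observed divergence, the inferred split time scales inversely with the rate, so halving the rate doubles every date. The numbers below are illustrative, not taken from the studies:

```python
# Inferred split time t = d / (2 * r): same observed divergence d,
# but the answer depends entirely on the assumed rate r.
d = 0.004  # illustrative per-site divergence between two species

fast_rate = 1.0e-9  # per site per year (an illustrative "old" rate)
slow_rate = 0.5e-9  # half that rate, as the new studies suggest

t_fast = d / (2 * fast_rate)  # 2.0 million years
t_slow = d / (2 * slow_rate)  # 4.0 million years: twice as old

print(t_fast, t_slow)
```

This is why a recalibrated clock forces a wholesale revision of dates rather than a tweak to a few of them.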
Recognizing these problems, Scally and Durbin propose a fix: They endorse a 30-year-old idea that the mutation rate was faster early in primate evolution, then slowed in the African apes, and perhaps slowed even more in human evolution—the so-called hominoid slowdown. This idea is backed by evidence that as body size gets bigger in various mammals, their generation times get longer, slowing per-year mutation rates, Scally says.
It should be borne in mind that these new studies deal only with the period after humans appeared on the scene. If a similar change in the molecular clock is discovered for even earlier periods, then my common ancestor with Baxter the Wonder Dog will get pushed back even further in time. But that will not matter to me.
Gibbons’s article is fascinating and well worth reading.