The value of the Hubble constant plays a crucial role in cosmology, particularly in calculating the age of the universe. The most direct way to obtain it is to measure the distances to stars, galaxies, and galaxy clusters and plot them against their recession speeds. The Hubble-Lemaitre law of cosmic expansion predicts that the resulting data should lie on a straight line, and the slope of that line gives the value of the constant. But while obtaining the speed of each object is relatively easy using the redshift of its light, measuring distances is a tedious and painstaking business that requires a ‘ladder method’ involving different techniques: you start by finding the distances to the objects closest to you, use those results to calibrate the distances to the next, more distant set, often with a different method, and so on. This technique is highly prone to systematic errors, because an error at any rung propagates to every more distant one. The people who do this kind of work have to pay extremely close attention to detail and I salute them for their diligence.
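To see how the slope gives the constant, here is a toy sketch (with made-up numbers, not real survey data) of fitting the straight line v = H0 × d that the Hubble-Lemaitre law predicts:

```python
# Toy illustration: if v = H0 * d, plotting recession velocity against
# distance gives a line through the origin whose slope is the Hubble constant.
import numpy as np

rng = np.random.default_rng(0)

true_H0 = 70.0                      # km/s/Mpc, an invented value for the demo
d = rng.uniform(10, 500, size=50)   # distances in megaparsecs
v = true_H0 * d + rng.normal(0, 200, size=50)  # velocities with scatter

# Least-squares slope for a line forced through the origin:
H0_est = np.sum(v * d) / np.sum(d * d)
print(f"fitted H0 = {H0_est:.1f} km/s/Mpc")
```

The real difficulty, as the rest of the post explains, is not the fit itself but getting trustworthy values of d to feed into it.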
Another way of determining the constant is to map the cosmic microwave background anisotropy and use that to find the best fit to a number of cosmological parameters, of which the Hubble constant is one. For those wanting to know more, I discussed how this is done in a post from seven years ago. The value obtained by this method is the one commonly used to calculate the age of the universe, and the currently much-quoted value of 13.8 billion years comes from that fit.
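As a rough back-of-envelope sketch (my own illustration, not the actual Planck fitting pipeline) of how a Hubble constant plus density parameters turn into an age: in a flat matter-plus-dark-energy model the age is 1/H0 times a dimensionless integral over the scale factor. Using illustrative Planck-like values (H0 = 67.8 km/s/Mpc, matter fraction 0.308, dark-energy fraction 0.692, radiation neglected):

```python
# Back-of-envelope age of the universe in a flat LambdaCDM model.
# Input values are illustrative Planck-like numbers, not an official fit.
import math

H0 = 67.8            # km/s/Mpc
Om, OL = 0.308, 0.692  # matter and dark-energy density fractions

KM_PER_MPC = 3.0857e19
SEC_PER_GYR = 3.156e16

# 1/H0, the "Hubble time", comes out near 14.4 billion years:
hubble_time_gyr = (KM_PER_MPC / H0) / SEC_PER_GYR

# Age = (1/H0) * integral_0^1 da / sqrt(Om/a + OL*a^2), done by midpoint rule
# (the midpoint rule avoids the a = 0 endpoint, where Om/a diverges):
n = 100_000
total = 0.0
for i in range(n):
    a = (i + 0.5) / n
    total += 1.0 / math.sqrt(Om / a + OL * a * a)
age_gyr = hubble_time_gyr * total / n
print(f"Hubble time ~ {hubble_time_gyr:.1f} Gyr, LCDM age ~ {age_gyr:.1f} Gyr")
```

With these inputs the integral shaves the naive Hubble time of about 14.4 billion years down to roughly 13.8 billion, which is why the fitted Hubble constant and the quoted age go hand in hand.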
You would expect (or hope) that the different ways of measuring its value would converge to a single value. But this article points out that the different methods do not lead to such a convergence, and the question is why. (I am a little embarrassed that I was not aware of this difference, though if I had looked at footnote #26 in the Astrophysical Constants Table, I would have learned about it a long time ago. Moral: Never ignore footnotes.)
In 2015 a huge team led by George Efstathiou of the University of Cambridge crunched microwave measurements from the European Space Agency’s Planck spacecraft and revealed the universe’s vital stats. Their results indicated the universe is expanding at a rate of 67.8 kilometers per second per megaparsec (a “megaparsec” being a unit of distance equal to 3.26 million light-years). Cosmologists typically drop that mouthful at the end and simply say that the Hubble constant is between 67 and 68.
Meanwhile competing groups of astronomers have been studying the expansion of the universe in a distinctly different way, by seeking out variable stars or supernova explosions of known distance and then directly measuring how quickly they are moving away from us. This “distance ladder” method is trickier than it sounds. Reckoning distances across many millions of light-years is a subtle, time-consuming task plagued by many possible kinds of systematic error. Get the location of a star wrong, and the entire calculation goes awry.
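A toy model (my own illustration with invented numbers, not any group's actual error budget) shows why ladder errors compound: each rung's distance scale is calibrated against the rung below, so a final distance is effectively a product of calibration factors.

```python
# Hypothetical fractional 1-sigma uncertainties per ladder rung (made-up):
import math

rungs = {
    "parallax to nearby variable stars": 0.02,
    "variable stars to nearby galaxies": 0.03,
    "supernovae in those galaxies": 0.025,
}

# Independent fractional errors on a product combine in quadrature,
# so the total is always worse than the worst single rung:
combined = math.sqrt(sum(f * f for f in rungs.values()))
print(f"combined fractional error ~ {combined:.3f}")

# A systematic miscalibration, by contrast, multiplies straight through:
# a 5% error in the lowest rung shifts every distance above it by 5%.
bias = 1.05
print(f"a 5% low-rung bias scales all higher-rung distances by {bias:.2f}")
```

This is the sense in which an error at any stage gets magnified: random errors accumulate rung by rung, and a single systematic bias contaminates the entire ladder above it.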
By steadily beating down the uncertainties and drawing on the latest observations of variable stars, one group has come up with its own high-precision answer for the constant: 73.2—and therein lies the controversy.
Cosmologists on both sides are also looking to outside groups for guidance. So far, those referees are only deepening the mystery. A University of California, Los Angeles, study that looks at how light is bent by distant galaxies gives a Hubble constant of 72.5, close to the distance-ladder result. Meanwhile an equally convincing study looking at how primordial sound waves affect the distribution of galaxies in the modern universe gives a constant of—you guessed it—67. Calculations of the Hubble constant anchored to the sound horizon consistently give a lower number than ones based on observations of stars and galaxies—and nobody knows why.
So what might be possible solutions? One is that the cosmic microwave radiation decoupled from matter about 7% earlier than the currently estimated 380,000 years after the Big Bang. But you cannot just make ad hoc changes to solve one problem. You also have to suggest a mechanism and make that change consistent with all the other available cosmological data, which usually requires invoking other auxiliary hypotheses. One mechanism being proposed is a new kind of substance in addition to dark matter and dark energy.
Needless to say, invoking a third new entity when the other two have not been detected as yet is not something to be taken lightly. As we often say in science, this requires more research, and various detection schemes are being proposed to address this anomaly.