The age and fate of the universe


Corey S. Powell provides a history of the Hubble-Lemaître law and of the efforts to pin down a precise value of the Hubble constant, which plays a significant role in determining not only the age of the universe but also its ultimate fate. Like the age of the Earth, the value of the Hubble constant, and thus the age of the universe, has shifted considerably over time.

But the latest puzzle is that two different methods of measuring its value have produced two fairly precise values whose error bars do not overlap, which is of course problematic. Attempts at reconciling them have led to suggestions that some property of dark energy may be responsible for the difference.

Following in the tradition of Edwin Hubble, [Adam] Riess and his collaborators are observing stars in neighbouring galaxies to measure the Hubble Constant, with the ambitious goal of pinning down the number to an accuracy of 1 per cent. His research is zeroing in on a value of 73. Today there is another, entirely separate way to measure the Hubble Constant, by analysing subtle patterns etched into the cosmic microwave background, detected by the Planck space telescope. This approach gives an equally precise-looking answer of 67. The disagreement, though tiny by historical standards, is unnerving enough that cosmologists have started calling it the Hubble tension.

Strictly speaking, the two sides are not measuring the same thing. Riess is looking at the expansion of the nearby Universe, at relatively modern times. The Planck telescope measures effects of expansion long ago, shortly after the Big Bang, and then researchers derive a modern value of the Hubble Constant from that measurement. One way to reconcile the two is to suppose that the very early Universe was expanding slightly faster than expected. ‘It could be that there is something funky about dark energy being stronger than we thought,’ Riess says. ‘I don’t think it’s introducing something new to say: “What if dark energy is weird?” because there’s no such thing as it not being not weird.’

This case illustrates how increasing the precision of measurements often exposes new problems that in turn require the invention of new theories to resolve. This happens a lot in science.
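To get a sense of what the disputed numbers mean, here is a minimal sketch in Python of the naive ‘Hubble time’ 1/H0, a rough age-of-the-universe estimate (the conversion factors are standard values; the function name is just for illustration). It ignores how the expansion rate changes over time, so it is only a ballpark figure, though in the standard cosmological model the true age happens to land close to it.

    # Naive "Hubble time" 1/H0: a rough estimate of the age of the universe.
    # The H0 values (in km/s/Mpc) are the ones quoted in the post and comments.

    KM_PER_MPC = 3.0857e19      # kilometres in one megaparsec
    SECONDS_PER_GYR = 3.156e16  # seconds in a billion years

    def hubble_time_gyr(h0_km_s_mpc):
        """Return 1/H0 in billions of years, for H0 given in km/s/Mpc."""
        h0_per_second = h0_km_s_mpc / KM_PER_MPC  # H0 in units of 1/s
        return 1.0 / h0_per_second / SECONDS_PER_GYR

    for h0 in (67.0, 69.8, 73.0):
        print(f"H0 = {h0:5.1f} km/s/Mpc  ->  1/H0 = {hubble_time_gyr(h0):.1f} Gyr")

    # H0 = 67 gives ~14.6 Gyr and H0 = 73 gives ~13.4 Gyr: a ~9 per cent
    # disagreement in H0 becomes a ~9 per cent disagreement in the naive age.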

Comments

  1. Jenora Feuer says

    This is something that Ethan Siegel of ‘Starts With a Bang’ (who’s also originally from scienceblogs) talks about somewhat often. Every method of measurement falls into one of two basic categories, early-Universe or late-Universe, and they have long since got to the point where the error bars don’t overlap anymore. Somebody’s assumptions somewhere must be wrong; we just don’t know which ones yet.

  2. file thirteen says

    Just last year astronomers calculated a Hubble constant value of 69.8 km/s/Mpc using the Hubble space telescope to look at red giants, roughly halfway between the other two measurements. This muddies the waters even more. It wouldn’t surprise me if all three measurements had unknown confounders making the margins of error much higher than estimated. A megaparsec is such an unfathomably huge distance.

  3. Jenora Feuer says

    The biggest problem with MOND is that it gets into ‘epicycles’ territory a lot faster than dark matter does: you have to start manually fine-tuning the theory to match the observations.

    In particular, it’s a lot easier to explain gravitational lensing along the paths of colliding galaxies by assuming there’s some unseen and largely intangible mass that continued moving along those paths when the visible gas clouds of the galaxies ran into each other and slowed down, than it is to tweak gravity to explain it.

    Sure, we have no idea what dark matter is (aside from it almost certainly not being neutrinos) and no direct detection method. It’s still less of a change to the theory.

  4. file thirteen says

    MOND is a very strange theory: that gravity might work differently on a cosmic scale.

    Dark matter is even weirder. I balk at the idea of large amounts of something completely undetectable hanging around just in order to balance the equations. As time goes by and people spend a lot of fruitless effort looking for the nonexistent signs of nothing, parallels with the searches for phlogiston and the ether spring to mind.

    So, back to MOND. But I doubt that you could even come up with equations that satisfy relativity merely by tweaking Newton’s laws. What is needed is a new Einstein.

  5. Dunc says

    I don’t understand why people have such a problem with DM… Pretty much the whole history of particle physics consists of people proposing currently undetectable particles in order to balance the equations, then trying to figure out ways to detect them (often without any success for decades), and it’s been extremely successful so far. In what way is DM so much worse than neutrinos, or the Higgs boson?

    As I understand it, even the proponents of MOND acknowledge that it can’t explain all of the observations and that you’d still need some kind of DM anyway.

  6. rrutis1 says

    file thirteen says @#2 “A megaparsec is such an unfathomably huge distance”…don’t you mean time? 😉

  7. file thirteen says

    @rrutis1 #6

    The sec in parsec relates to its definition in terms of arcseconds (as in 1/3600 of a degree), not seconds of time.

    @Dunc #5

    It’s true that both neutrinos and the Higgs were theorised before their discovery, and neutrinos are perhaps the best evidence that something like dark matter might exist, on the account of their extremely low reactivity. However, as yet there seems to be no rhyme or reason to the “distribution” of dark matter other than the assertion that something needs to be there to make the gravity equations work. Beyond that, we have no knowledge of what it might be, where it came from, why we don’t have any nearby, why some galaxies need heaps of it and some none at all, or why we can’t detect it in any way. I would be a lot less skeptical if we had any evidential answers to those questions.

  8. file thirteen says

    Edits to #7, because I can’t edit previous posts:
    -- I meant in terms of a single arcsecond, not arcseconds. An astronomical unit (AU) is roughly the distance from the Earth to the Sun. A parsec is the distance at which an object one AU across would take up one arcsecond of the sky, which works out to about 3.26 light years (see the sketch below). A light year is the distance light travels in a year. A megaparsec is a million parsecs. Yes, when you look at something a megaparsec away you see it as it was millions of years ago, but it’s still a unit of distance, not time.
    -- I meant on account of, not on the account of, ugh
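    To make that concrete, here’s a minimal sketch in Python of the small-angle definition (the AU and light-year values are standard figures, not something from this thread):

        import math

        # A parsec is the distance at which a baseline of 1 AU subtends
        # an angle of 1 arcsecond.
        AU_KM = 1.496e8                      # astronomical unit in km
        LIGHT_YEAR_KM = 9.461e12             # light year in km
        ARCSEC_RAD = math.radians(1 / 3600)  # one arcsecond in radians

        parsec_km = AU_KM / math.tan(ARCSEC_RAD)
        print(f"1 parsec = {parsec_km:.3e} km = {parsec_km / LIGHT_YEAR_KM:.2f} light years")
        # -> 1 parsec = 3.086e+13 km = 3.26 light years
        # A megaparsec is a million of these, about 3.26 million light years.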
