Bad Astronomy | Measurements of the Universe’s expansion rate don’t agree

There is a problem with the universe.

Or there is a problem with how we perceive it. Either way, something is fishy.

In a nutshell, the universe expands. There are a whole bunch of different ways to measure that expansion. The good news is that these methods all get roughly the same number for it. The bad news is that they don’t get exactly the same number. One group of methods gets one number, and another group gets a different one.

This discrepancy has been around for a while, and it is not getting any better. In fact, it is getting worse (as astronomers like to say, there is a growing tension between the methods). The big difference between the two groups is that one set of methods looks at relatively nearby things in the Universe, and the other at very distant ones. Either we are doing something wrong, or the Universe is doing something different far away than it is doing nearby.

In a new paper just published, astronomers use a clever method to measure the expansion rate using relatively nearby galaxies, and what they find is consistent with the other “nearby objects” methods. Which may or may not help.

OK, a little background… we’ve known for a century or so that the universe is expanding. We see galaxies all moving away from us, and the farther away a galaxy is, the faster it appears to move. As far as we can tell, there is a tight relationship between the distance of a galaxy and how fast it appears to be moving away. So say a galaxy 1 megaparsec* away (abbreviated Mpc) is moving away from us at 70 kilometers per second; one twice as far (2 Mpc) moves away twice as fast (140 km/sec).
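That proportionality is simple enough to sketch in a few lines of code. This is a toy illustration only, using the made-up example value of 70 km/sec/Mpc from above, not a measured result:

```python
# Hubble's law: recession velocity is proportional to distance, v = H0 * d.
# Toy illustration only, using the example value of 70 km/s per Mpc from the text.

H0 = 70.0  # km/s per Mpc (example value, not a measurement)

def recession_velocity(distance_mpc: float) -> float:
    """Apparent recession velocity in km/s for a galaxy at the given distance in Mpc."""
    return H0 * distance_mpc

print(recession_velocity(1.0))  # 70.0 km/s
print(recession_velocity(2.0))  # 140.0 km/s -> twice as far, twice as fast
```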

This relationship seems to hold over vast distances, and the constant of proportionality is called the Hubble Constant, or H0 (pronounced “H naught”), after Edwin Hubble, who was one of the first to propose the idea. It is measured in the odd units of kilometers per second per megaparsec (or velocity per distance: how much faster something recedes for every megaparsec farther away it is).

Methods that look at closer objects such as stars in nearby galaxies, exploding stars, and so on get H0 to be about 73 km/sec/Mpc. But methods that use more distant things, such as the cosmic microwave background and baryon acoustic oscillations, get a smaller number, more like 68 km/sec/Mpc.

Those are close, but they are not the same. And since the methods in each group all seem internally consistent, this is a problem. What is going on?

The new paper uses a cool method called surface brightness fluctuations. It’s a fancy name, but the idea behind it is actually pretty intuitive.

Imagine standing on the edge of a forest, right in front of a tree. Because you are so close, you only see one tree in your field of vision. Back up a bit and you may see more trees. Back up further and you can see even more.

Same with galaxies. Observe a nearby one with a telescope. In a given pixel of your camera you might see ten stars, all blended together into that single pixel. Just due to statistics, another pixel might catch 15 (making it 50% brighter than the first pixel), and another 5 (half as bright as the first pixel).

Now look at a galaxy that is the same in every way, but twice as far away. In one pixel you might see 20 stars, and in others 27 and 13 (a difference of roughly 35%). At ten times the distance you’d see 120, 105, and 90 (a difference of about 10%) – note I’m simplifying hugely here and just making up numbers as an example. The point is that the farther away a galaxy is, the smoother its brightness distribution looks (the difference between pixels becomes smaller compared to the total in each pixel). Not only that, it’s smoother in a way that you can measure and assign a number to.
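The underlying statistics are easy to sketch: if each pixel averages N stars, the pixel-to-pixel scatter goes roughly as the square root of N, so the relative fluctuation falls off as 1/√N. Here is a toy illustration of that scaling; the actual analysis in the paper is far more sophisticated, and these star counts are just the made-up numbers from above:

```python
# Toy sketch of the statistical idea behind surface brightness fluctuations:
# with ~N stars blended into each pixel, Poisson statistics give a pixel-to-pixel
# scatter of ~sqrt(N), so the *relative* fluctuation shrinks as 1/sqrt(N).
# Farther galaxies pack more stars into each pixel, so they look smoother.

import math

def relative_fluctuation(stars_per_pixel: float) -> float:
    """Approximate fractional pixel-to-pixel brightness scatter."""
    return 1.0 / math.sqrt(stars_per_pixel)

# Made-up star counts echoing the example in the text (nearby, 2x, 10x the distance)
for n in (10, 20, 105):
    print(f"~{n} stars per pixel -> ~{relative_fluctuation(n):.0%} fluctuation")
```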

In reality it’s more complicated than that. If a galaxy is forming stars in one section, it throws off the numbers, so it’s best to look at elliptical galaxies, which haven’t made new stars in billions of years. The galaxy also has to be close enough to get good statistics, limiting this to galaxies that are perhaps 300 million light-years away or closer. You also need to account for dust, and for background galaxies and star clusters in your images, and for the way galaxies have more stars toward their centers, and … and … and …

But all of these things are known and fairly straightforward to correct.

When they did all this, the number they got for H0 was (drum roll …) 73.3 km/sec/Mpc (with an uncertainty of about ±2 km/sec/Mpc), right in line with the other nearby methods and very different from the group using distant ones.

In a way that’s expected, but it again drives home the idea that we are missing something important here.

All the methods have their issues, but the uncertainties are pretty small. Either we are seriously underestimating those uncertainties (always possible, but at this point a little unlikely) or the Universe is behaving in a way we did not expect.

If I were to bet, I would go with the latter.

Why? Because it’s happened before. The Universe is troublesome like that. Since the nineties we’ve known that the expansion rate isn’t constant. Astronomers saw that distant exploding stars were consistently farther away than a simple measurement indicated, leading them to realize that the Universe is expanding faster now than it used to, which in turn led to the discovery of dark energy – the mysterious entity accelerating the Universe’s expansion.

When we look at distant objects, we see them as they were in the past, when the Universe was younger. If the rate of expansion of the Universe was different back then (say, 12 to 13.8 billion years ago) than it is now (less than a billion years ago), we could get two different values for H0. Or maybe different parts of the Universe are expanding at different rates.

If the expansion rate has changed, that has profound implications. It means the Universe is not the age we think it is (we use the expansion rate to work backwards to its age), which means it has a different size, which means the time it takes for things to happen is different. It means the physical processes that took place in the early Universe happened at different times, and perhaps other processes are involved that affect the rate of expansion.
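To get a feel for how the age depends on H0, the simplest yardstick is the “Hubble time”, 1/H0. The real age calculation also folds in how much matter and dark energy the Universe contains, so this is only a rough sketch of the scaling:

```python
# Rough sketch of why H0 matters for the Universe's age: the "Hubble time" 1/H0
# sets the basic age scale. The real calculation also depends on the Universe's
# matter and dark energy content, so treat these numbers as illustrative only.

KM_PER_MPC = 3.086e19   # kilometers in one megaparsec
SEC_PER_GYR = 3.156e16  # seconds in one billion years

def hubble_time_gyr(h0_km_s_per_mpc: float) -> float:
    """Return 1/H0 in billions of years for H0 given in km/sec/Mpc."""
    h0_per_sec = h0_km_s_per_mpc / KM_PER_MPC  # convert H0 to units of 1/second
    return 1.0 / h0_per_sec / SEC_PER_GYR

print(hubble_time_gyr(73.3))  # ~13.3 billion years
print(hubble_time_gyr(68.0))  # ~14.4 billion years -> smaller H0, older Universe
```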

So yeah, it’s a mess. Either we don’t understand well enough how the Universe behaves, or we’re not measuring it properly. Either way, it’s a big pain. And we just do not know which it is.

This new paper makes it even more apparent that the discrepancy is real, and the Universe itself is to blame. But it is not conclusive. We need to keep at this, keep beating down the uncertainties, keep trying new methods, and hopefully at some point we will have enough data to point to something and say, “Aha!”

It will be an interesting day. Our understanding of the cosmos will take a big leap when that happens, and then cosmologists will have to find something else to argue about. Which they will. It’s a big place, this Universe, and there’s plenty in it to wonder about.


* A parsec is a unit of length equal to 3.26 light-years (or 1/12th of a Kessel Run). I know it’s a strange unit, but it has a lot of historical significance and is tied to many of the ways we measure distance. Astronomers looking at galaxies like to use the unit of megaparsec, where 1 Mpc is 3.26 million light-years. That’s a little longer than the distance between us and the Andromeda galaxy.
