Is this an accurate summary of the current status quo?
1. The universe is expanding
2. The velocity at which the universe is expanding is increasing, since there is non-zero acceleration.
3. The acceleration is actually slowing down, but it is not heading towards zero. It is approaching some steady state acceleration.
----------
In Exploring Black Holes by Edwin Taylor and John Wheeler, published in 2000, I recently read the following passage by Wheeler:
> An article by John Noble Wilford in the Science Times section of the New York Times for Tuesday March 3, 1998, reports observations by two separate groups of investigators which they interpret as showing that today the expansion of the Universe is speeding up rather than undergoing the slowdown expected for any approach to maximum expansion. Later that day I encountered a hard-bitten veteran gravitation physics colleague in the elevator of the Princeton physics building and asked him if he believed the purported evidence of accelerating expansion. "No," he replied. Neither do I. Why not? Two reasons: (1) Because the speed-up argument relies too trustingly on the supernovas being standard candles, (2) Because such an expansion would, it seems to me, contradict a view of cosmology too simple to be wrong. Such clashes between theory and experiment have often triggered decisive advances in physics. We can hope that some decisive advance is in the offing.
Now, Wheeler was one of the most preeminent physicists of the 20th century, spearheaded several fields, and was not afraid of wild ideas. I am curious if anyone knows whether he had any later thoughts that furthered these ideas. Was he just wrong here, before new measurements came out? Or is there something missing in the calculations and theory? I'm also not sure what he means in (1).
> 3. The acceleration is actually slowing down, but it is not heading towards zero. It is approaching some steady state acceleration.
No. The scale factor in LCDM (the current concordance model) is proportional to sinh^(2/3)(t/t_Lambda), which means it's asymptotically exponential. Acceleration will always increase.
(Edit: The expansion rate, so a'/a, will go to a constant value, though.)
Caveat: If the Hubble tension proves real, LCDM is wrong.
To the Wheeler question, I would say he was just wrong. Pre-eminent physicist of the 20th century he may have been, and smarter than I'll ever be, but he was also nearly 90 at that point and had retired 22 years earlier.
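The asymptotic behavior claimed above (scale factor proportional to sinh^(2/3)(t/t_Lambda), expansion rate a'/a falling toward a nonzero constant while the scale factor itself goes exponential) can be checked numerically. A minimal sketch, in illustrative units with t_Lambda = 1; the overall normalization of a cancels in H = a'/a:

```python
import math

T_LAMBDA = 1.0  # illustrative unit; the real value is set by the cosmological constant

def a(t):
    # LCDM late-time scale factor: a(t) proportional to sinh^(2/3)(t / t_Lambda)
    return math.sinh(t / T_LAMBDA) ** (2 / 3)

def hubble(t, eps=1e-6):
    # H = a'/a, estimated with a central finite difference
    return (a(t + eps) - a(t - eps)) / (2 * eps * a(t))

# Analytically H(t) = (2 / (3 t_Lambda)) coth(t / t_Lambda): it decreases
# monotonically but approaches the constant 2/3 (in these units) rather than 0,
# while a(t) itself approaches exponential (accelerating) growth.
for t in [0.5, 1, 2, 5, 10]:
    print(f"t={t}: H={hubble(t):.4f}")
```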
To (1), he's talking about the distance ladder. Distances to various galaxies were computed by the teams involved in developing the LCDM model by scouring the skies for instances of Type Ia supernovae in those galaxies. When these are called "standard candles," it means the luminosity is known, so the distance can be computed from the inverse square law by comparing the observed brightness to the known luminosity. Historically, this wasn't a super-reliable method and carried significant error for various reasons, including dust reddening the light in a way that could be mistaken for redshift, and the fact that Type Ia supernovae are not all exploding white dwarfs of exactly equal mass, and you might catch them at different times after the initial explosion.
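As a toy version of that inverse-square computation (all numbers are illustrative, not real survey values):

```python
import math

def candle_distance(luminosity_watts, flux_w_per_m2):
    # "Standard candle": if the intrinsic luminosity L is known, the observed
    # flux F = L / (4 pi d^2) can be inverted for the distance d.
    return math.sqrt(luminosity_watts / (4 * math.pi * flux_w_per_m2))

L_SN = 1e36    # rough peak luminosity of a Type Ia supernova, in watts
F_OBS = 1e-15  # hypothetical measured flux, in W/m^2
d_metres = candle_distance(L_SN, F_OBS)
print(d_metres / 9.461e15, "light-years")
```

This also shows why the method is fragile in exactly the way the comment describes: anything that dims the measured flux (dust, calibration error) inflates the inferred distance.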
What Wheeler probably didn't realize at that point was the methodological advances and, frankly, the downright tedious repeated measurement work over decades that astronomers had done to make this more reliable and accurate. If you want to know all about this via a first-hand account from one of the astronomers credited with the dark energy discovery, read The Extravagant Universe by Robert Kirshner. It's very accessible, not super-technical, but goes into painstaking detail on exactly how astronomers made Type Ia supernovae work as standard candles.
It's also just a great testament to the work involved in experimental astronomy, down to the logistics of convincing a scheduling committee for the various telescopes capable of seeing that far that what you're looking for is worth looking for.
Possibly a dumb question: can the tension between the two rates simply be resolved because one method measures an old historical rate and one method measures the current rate, and it’s been dropping over time? Or are they both measuring the same time periods?
(I assume it’s not that simple, but it’s very hard for me to understand and reason about data that involves looking back in time and space simultaneously)
Both methods measure H_0, which is the Hubble parameter today. It's a bit confusing that it's sometimes called the Hubble constant, because it's not really a constant. The expansion rate H changes over time, so we write H(t), and by H_0 we mean H(t_today). The early-universe probes didn't directly measure H_0; they measured H at that early time and then used the standard model to extrapolate that value to H_0, roughly speaking. This means that, if the measurements didn't suffer from some unknown systematic error, the model is wrong. That's what they call "new physics", and it could be the most exciting thing in cosmology since the 90s.
> Possibly a dumb question: can the tension between the two rates simply be resolved because one method measures an old historical rate and one method measures the current rate
This is already generally being accounted for (assuming we have the correct cosmological model for the Universe). The CMB measurement is probing the Universe at an early stage (when the Universe was ~380,000 years old) and the Hubble "constant", H(t=380,000 yr), was different at that point from what it is now, H_0 = H(t=13.7 billion years). The comparison to the current Hubble constant (e.g., determined from Cepheids) is made by evolving the CMB Hubble "constant" value to today, under the assumption of our best cosmological model.
Of course, if we have the wrong cosmological model, then that could be a reason for the discrepancy, and that might point to "new physics". Alternately there may be some systematic in the Cepheid technique that is causing our H_0 estimate to be somewhat off.
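That "evolve the CMB value to today" step runs through the Friedmann equation. A flat-LCDM sketch (parameter values are illustrative, not the actual Planck fit):

```python
import math

H0 = 67.4               # km/s/Mpc, illustrative
OM, ORAD = 0.315, 9e-5  # matter and radiation density fractions
OL = 1 - OM - ORAD      # dark energy fraction, from flatness

def H(z):
    # Flat LCDM Friedmann equation: H(z) = H0 sqrt(Om(1+z)^3 + Or(1+z)^4 + OL)
    return H0 * math.sqrt(OM * (1 + z) ** 3 + ORAD * (1 + z) ** 4 + OL)

# At recombination (z ~ 1100) the expansion rate was tens of thousands of
# times larger than today; quoting an H0 from the CMB means running this
# model (or rather its full version) between that epoch and now.
print(H(1100) / H0)  # ratio of then to now
```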
That idea is mentioned in the article, with the hypothesis that there was more dark energy in the past that has since dissipated. They label that idea as "New Physics" in the chart above that comment.
So does this require new physics to solve? What should the layman take away from this?
>distant galaxies have been speeding up in their recession, and the expansion rate, though still dropping, is not headed toward zero.
If the expansion rate is dropping, surely it is headed towards zero? Or are they using expansion rate to mean acceleration and the zero refers to the recession. Or am I misunderstanding something?
Imagine the expansion rate starts out at 2. Then 1 day later it's 1.5, after 1 more day it's 1.25 and after another day it's 1.125.
Hopefully you can see that if this series continues then the expansion rate is always dropping, but it's headed towards 1, not 0. (And if expansion rate of 1 is too confusing in this context imagine if it starts out at 3 and goes to 2.5, 2.25, 2.125, ..., it still is always decreasing, but it will never be less than 2 which means the universe keeps expanding).
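That toy sequence is just 1 + 2^(-n), which is strictly decreasing but bounded below by 1:

```python
# Hypothetical "expansion rates": 2, 1.5, 1.25, 1.125, ... = 1 + 2^(-n)
rates = [1 + 2 ** -n for n in range(12)]
print(rates[:4])  # [2.0, 1.5, 1.25, 1.125]

# Strictly decreasing at every step...
assert all(earlier > later for earlier, later in zip(rates, rates[1:]))
# ...yet never at or below the limit of 1, so "always dropping" and
# "not headed toward zero" are perfectly compatible.
assert all(r > 1 for r in rates)
```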
Not necessarily. First, the distance ladder could have another problem. Second, the early-universe observations look through the entire universe at the CMB, which requires foreground subtraction, a very non-trivial task. Finally, from theory: General Relativity is a non-linear theory, which means taking the average and then evolving it does not necessarily yield the same result as evolving the initial state and then taking the average. Any of these could explain the tension, though an actually dynamic cosmological constant would be more fun.
> If the expansion rate is dropping, surely it is headed towards zero?
Expansion rate could be dropping but converging to a non-zero value. That seems unlikely to me, but it's an answer that would fit that description just fine.
The language is a bit imprecise, though, which I expect is the problem. The (to me) obvious technical interpretation of "expansion rate is headed towards zero" is that d size(t)/dt -> 0 as t -> infinity, but the (again, to me) obvious non-technical interpretation is "expansion will completely stop at some point". So "*not* headed towards zero" means "derivative isn't going to zero", or "expansion never quite stops", respectively.
The derivative of ln(t) does go to zero, but it has unbounded growth, so it fails the first test but passes the second. The universe experiencing logarithmic expansion seems reasonable enough.
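Numerically, for size(t) = ln(t), the growth rate 1/t collapses toward zero while the size itself keeps climbing without bound:

```python
import math

for t in (10.0, 1e3, 1e6):
    rate = 1 / t           # d/dt ln(t)
    size = math.log(t)
    print(f"t={t:g}: rate={rate:g}, size={size:.2f}")
# The rate column heads to 0 (first interpretation), but the size column
# grows forever, so expansion never stops (second interpretation).
```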
It's confusing because that quote about "expansion rate decreasing" is in the caption of a picture with a giant "Accelerating Expansion" label at present time.
Agreed; the presentation was, IMHO, terrible. You had to read each image's relatively hard-to-read annotation to make sense of what they were trying to say. Hell, 95% of the entire article was image annotations, with single sentences between. Really difficult to follow along.
>However, for the past ~6 billion years, distant galaxies have been speeding up in their recession, and the expansion rate, though still dropping, is not headed toward zero.
Am I not getting something here, or does this sentence not make sense?
The first image depicting space expansion in that article: what happens if some is to travel beyond that space boundary? Is it because of speed of light that no one can catch up beyond it or is it something else?
The graphic is an attempt to project the 3D observable universe down to 2D and then depict the time evolution of it to show that it has expanded in spatial volume, with the bell shape showing extremely rapid expansion during cosmic inflation, then slowing expansion, then accelerating expansion again as matter density became low enough for dark energy density to take over. But the observable universe is not the entire universe. There is no known boundary to space and there probably is no boundary at all. We'll never have any way of knowing, but space could easily be infinite in extent, while still expanding at every point. If you were on the boundary depicted (the cosmic horizon of the Earth-centered observable universe), or even if you were outside of that completely, the picture from your vantage point should look exactly the same.
As for what the horizon represents, it's all light that could have reached Earth within 13.7 billion years, that is, Earth's past light cone. It isn't necessarily the case that nothing leaving Earth today could ever get past that horizon. Light can travel forever if nothing ever scatters it. Given some very large amount of time, it can get beyond the boundary depicted there. Because of the expansion of space, however, that time is likely to be more than 13.7 billion years, and given the accelerating expansion of space, there will eventually come a time when no light leaving any point within our local supercluster can ever reach any other supercluster. They will become forever outside of each other's horizons.
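That last point can be made quantitative: the comoving distance a photon emitted today can ever cover is the integral of c dt / a(t), and with the LCDM late-time scale factor this integral converges to a finite value. A rough midpoint-rule sketch (c = 1, time in units of the dark-energy timescale; purely illustrative):

```python
import math

def a(t):
    # LCDM late-time scale factor (normalized): sinh^(2/3)(t)
    return math.sinh(t) ** (2 / 3)

def comoving_reach(t_emit, t_max, steps=20000):
    # Comoving distance covered by a photon emitted at t_emit, by time t_max:
    # integral of dt / a(t), midpoint rule, with c = 1
    dt = (t_max - t_emit) / steps
    return sum(dt / a(t_emit + (i + 0.5) * dt) for i in range(steps))

# Giving the photon 5x more time barely increases its reach: the integral
# converges, i.e. accelerating expansion creates a finite event horizon.
print(comoving_reach(1.0, 10.0))
print(comoving_reach(1.0, 50.0))
```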
The Hubble tension is "bad" for current models of cosmology (i.e., it indicates a problem with them), hence strengthening it makes things "worse." It's like a hurricane: a strengthening hurricane is worse.
naasking (replying to the standard-candle discussion above): Some physicists are contesting this: https://arxiv.org/abs/1808.04597
fjfaase: https://arxiv.org/abs/2307.15806
uoaei: https://youtu.be/AwwIFcdUFrE
fluoridation: The observable universe does have an event horizon that can't be crossed, though, given the finite speed of light and the accelerating expansion.