Analysis indicates that the universe’s expansion is not accelerating

261 points | chrka | 3 months ago | ras.ac.uk

208 comments

JumpCrisscross|3 months ago

“Supernova (SN) cosmology is based on the key assumption that the luminosity standardization process of Type Ia SNe remains invariant with progenitor age. However, direct and extensive age measurements of SN host galaxies reveal a significant (5.5σ) correlation between standardized SN magnitude and progenitor age, which is expected to introduce a serious systematic bias with redshift in SN cosmology. This systematic bias is largely uncorrected by the commonly used mass-step correction, as progenitor age and host galaxy mass evolve very differently with redshift. After correcting for this age bias as a function of redshift, the SN data set aligns more closely with the cold dark matter (CDM) model” [1].
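
A toy numerical sketch of what such an age correction does to inferred distances (the slope, sign, and reference age below are invented for illustration, not the paper's measured values):

```python
import math

# Illustrative age-luminosity slope (NOT the paper's measured value);
# the sign and reference age are chosen purely for demonstration.
SLOPE_MAG_PER_GYR = 0.04  # hypothetical magnitudes per Gyr of progenitor age

def age_corrected_magnitude(m_std, age_gyr, ref_age_gyr=5.0):
    """Remove a linear progenitor-age term from a standardized magnitude."""
    return m_std - SLOPE_MAG_PER_GYR * (age_gyr - ref_age_gyr)

def luminosity_distance_mpc(m, abs_mag=-19.3):
    """Invert the distance modulus: mu = m - M = 5 * log10(d / 10 pc)."""
    mu = m - abs_mag
    return 10 ** (mu / 5 + 1) / 1e6  # pc -> Mpc

# A high-z supernova with a young (2 Gyr) progenitor gets its inferred
# distance shifted once the age term is removed -- the kind of
# redshift-dependent systematic the abstract describes.
m_obs = 24.0
d_raw = luminosity_distance_mpc(m_obs)
d_corr = luminosity_distance_mpc(age_corrected_magnitude(m_obs, 2.0))
```

Because progenitor ages trend younger at higher redshift, even a small slope like this tilts the whole Hubble diagram systematically rather than just adding scatter.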

[1] https://academic.oup.com/mnras/article/544/1/975/8281988?log...

negativelambda|3 months ago

I know the team that did this. In fact, I was listening to their seminar just a few days ago. They are very careful and have been working on this for a long time. One caveat that they readily admit is that the sample used to create the luminosity-age relation has some biases, such as galaxy type and relatively low redshift. They will be updating their results with the Rubin LSST data in the next few years.

Exciting times in cosmology after decades of a standard LCDM model.

the__alchemist|3 months ago

I did a deep dive into cosmology simulations about a year ago. It was striking how much is extrapolated from the brightness of small numbers of galaxy-surface pixels. I was looking at this for galaxies and stars, and observed something similar. The cosmology models are doing their best with sparse info, but to me the predictions about things like dark matter and dark energy are presented with more confidence than the underlying data supports. Not enough effort is spent trying to come up with new models (not to mention the tendency to shut down alternatives to Lambda-CDM, or the need for a better understanding of the consequences of GR and of the assumptions behind applying Newtonian instant-effect gravity in simulations).

Whenever I read things like "This model can't explain the bullet cluster, or X rotation curve, so it's probably wrong" my internal response is "Your underlying data sources are too fuzzy to make your model the baseline!"

I think the most established models are doing their best with the data they have, but there is so much room for new areas of exploration based on questioning assumptions about the feeble measurements we can make from this pale blue dot.

nabakin|3 months ago

> type Ia supernovae, long regarded as the universe’s "standard candles", are in fact strongly affected by the age of their progenitor stars.

A key point in the article. From what I understand, this is the main way we measure things of vast distance and, from that, determine the universe's rate of expansion. If our understanding of these supernovae is wrong, as this paper claims, that would be a massive scientific breakthrough.

I'm really interested in the counterargument to this.

gorbot|3 months ago

I'm dumb and barely understand things at a high level, but standard candles never sat right with me, so it's interesting to hear that they might not be. But then again, who knows.

basch|3 months ago

This is mostly my physics ignorance talking, but if we measure distance in spacetime and not just space, and speed or velocity is spacetime/time (which are somehow both relative to each other), and the derivative of velocity is acceleration, can't acceleration mean either expanding "faster" in the sense of distance OR time speeding up or slowing down? All of it seems so self-referential it's hard to wrap your head around.

stronglikedan|3 months ago

> If our understanding of these supernovae is wrong, as this paper claims, that would be a massive scientific breakthrough.

Indeed. It's so hard to definitively prove things that are, that the most significant breakthroughs prove things that aren't (so to speak), imho.

ertgbnm|3 months ago

Seems like the problem should be pretty easy to figure out. Just need to wait ~5 gigayears and see which model is right. I'm personally hoping for deceleration so that we have more total visitable volume.

I'll set a reminder to check back at that time to see who was right.

shadowgovt|3 months ago

Oh, I'm not going to care about visitable volume.

With 5 gigayears to work with I'm going to move a few star systems over, break down all the matter orbiting the star into a Dyson sphere made of computronium, and simulate visiting any world I could possibly ever want to.

mabster|3 months ago

I just pictured someone getting a message to check which model was right from an ancestor 20 giga generations ago!

jimbo808|3 months ago

Anyone know how credible this is? If true, then that means the big bounce is back on the menu, and the universe could actually be an infinitely oscillating system.

jampekka|3 months ago

At least The Guardian has a comment from an independent expert:

"Prof Carlos Frenk, a cosmologist at the University of Durham, who was not involved in the latest work, said the findings were worthy of attention. “It’s definitely interesting. It’s very provocative. It may well be wrong,” he said. “It’s not something that you can dismiss. They’ve put out a paper with tantalising results with very profound conclusions.”"

https://www.theguardian.com/science/2025/nov/06/universe-exp...

pdonis|3 months ago

> If true, then that means the big bounce is back on the menu

I don't think so. Deceleration does not imply recollapse. AFAIK none of this changes the basic fact that there isn't enough matter in the universe to cause it to recollapse. The expansion will just decelerate forever, never quite stopping.

pdonis|3 months ago

> Anyone know how credible this is?

AFAIK the previous models all assumed that Type Ia supernovae were not affected by the age of the progenitor stars, with no actual analysis to back that up; it was just the simplest assumption. This research is now actually doing the analysis.

khimaros|3 months ago

time to re-read "The Last Question"

jumploops|3 months ago

“We can’t observe the whole universe, so cosmology is not really about the universe. It’s about the observable patch and the assumptions we make about the rest.”

(paraphrasing George Ellis)

We’re in a bounding sphere, with a radius that’s roughly 46.5 billion lightyears, so any observation we make may be true for our local observable range, but there’s no (known) way to know what’s beyond that sphere.
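
For what it's worth, the ~46.5 billion light-year radius falls out of a one-line integral of the flat ΛCDM Friedmann equation. A rough sketch with round textbook parameters (H0 = 70, Ωm = 0.3; radiation is ignored, so expect it to land a bit off the quoted figure):

```python
import math

# Round flat-LCDM parameters (illustrative, not a precision fit)
H0 = 70.0           # km/s/Mpc
OMEGA_M, OMEGA_L = 0.3, 0.7
C = 299792.458      # speed of light, km/s
MLY_PER_MPC = 3.2616

def E(z):
    """Dimensionless Hubble rate H(z)/H0, matter + lambda only."""
    return math.sqrt(OMEGA_M * (1 + z) ** 3 + OMEGA_L)

def particle_horizon_gly(z_max=1e7, steps=200_000):
    """Comoving particle horizon: (c/H0) * integral dz / E(z).
    Trapezoid rule in u = ln(1+z) so the huge z range is sampled sensibly."""
    u_max = math.log(1 + z_max)
    du = u_max / steps
    total = 0.0
    for i in range(steps):
        u0, u1 = i * du, (i + 1) * du
        f0 = math.exp(u0) / E(math.exp(u0) - 1)  # dz = e^u du
        f1 = math.exp(u1) / E(math.exp(u1) - 1)
        total += 0.5 * (f0 + f1) * du
    d_mpc = (C / H0) * total
    return d_mpc * MLY_PER_MPC / 1000  # -> billions of light-years
```

With these inputs the result comes out in the mid-40s of Gly, i.e. roughly 3.3 Hubble radii, which is why the observable universe is so much bigger than 13.8 billion light-years.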

observationist|3 months ago

The more we learn, the less we end up knowing about how "everything" works. Some things are mathematical in nature and demonstrate absolutes, but frameworks shift and complexify, and exceptions to things we thought were absolutes have occurred throughout history.

For claims about how the universe works at scales and timeframes so utterly beyond anything testable, it's a little difficult to say this is credible at all. Not dunking on the researchers, but in order to validate their conclusions there's a whole chain of dependencies and assumptions you'd have to follow along with, and each of those will be its own complex bird's-nest tangle of assertions. I don't see how you can really say one way or another until you have a lot more information and a lot better Theory of Everything than we've got right now.

For what it's worth, for all the impact it'll have on anyone's life outside of academia, I'd say they're 100% correct and people should buy them free beers at their local pubs for at least the next year in return for explaining their ideas at length.

ghtbircshotbe|3 months ago

How does an infinitely oscillating universe comply with the 2nd law of thermodynamics?

ls612|3 months ago

I’m gonna wait for Scott Manley to discuss it before I form much of an opinion.

wtcactus|3 months ago

Standard candles are the gift that keeps on giving: all these measurements of redshift versus distance require us to actually get the distance of what we are measuring right.

This study (and many others, depending on the cosmic scales they use) mainly uses supernovae of Type Ia, i.e. the explosion of a white dwarf in a binary system that is accreting mass from a very nearby companion star, gaining mass until it approaches a critical threshold, undergoes runaway fusion, and goes supernova with all the added energy.

That was (and still is, with some corrections found since the middle of the last century) supposed to be the same everywhere. Problem is, we keep finding new corrections to it, like this study claims.

That is in fact the big claim of this study (ignore the universe-expansion part): they found a new correction to the Type Ia supernova luminosity. It's a very big claim and extremely interesting if confirmed. But, like all big claims, it needs a big confirmation. I'm a bit skeptical TBH.

jldl805|3 months ago

>I'm a bit skeptic

Out of curiosity, what data are you drawing on, or what qualifications do you have, that support your skepticism over three different modes of analysis (as well as pretty much every recent development in the field) supporting this claim:

       "Remarkably, this agrees with what is independently predicted from BAO-only or BAO+CMB analyses, though this fact has received little attention so far."

aatd86|3 months ago

I would not be surprised if the universe were somewhat elastic: it expands, then contracts, then expands, ad infinitum. After all, existence in itself is irrefutable and cannot not exist by definition.

That is, if we subscribe to a theory of the multiverse, set theory, likelihood, and interaction-driven evolution based on gradient-type fundamental laws, locally changing. Obviously everything shares a fundamental quality that is part of existence itself, but there are sets, there is differentiation. It is not created, though; the infinity of unconstrained possibilities exists in the first place and reorganizes itself, a bit like people who are attracted to people who share some commonalities, or who have something they need from each other, and form tribes. The same kind of process works for synapse connections, for molecule formation, for atoms, etc. Everything is mostly interacting data.

We could say that the concept of distance is a concept of likelihood: the closer is also the more likely.

Just a little weird idea. I need to think a bit more about it. Somewhat metaphysical?

antonvs|3 months ago

> After all, existence in itself is irrefutable and cannot not exist by definition.

I can say the same about forgnoz, which is something I've just invented that must exist by definition.

You'd need to try a bit harder to make existence actually inevitable.

bombdailer|3 months ago

Eventually we will find that the heat death of the universe and the big bang are the same thing. Since the totality of the universe is always a oneness, from the universal perspective the infinitely small and the infinitely large are the same thing (one), and they by nature bleed into (and define) each other, like yin and yang.

sfink|3 months ago

If you cover up the part of the Figure 3 graph past "now", it kind of fits a sine wave. https://ras.ac.uk/sites/default/files/2025-10/Figure%203.jpg

Universe gong.

mrb|3 months ago

A funny coincidence is that the solar system was formed 4.6 billion years ago which is exactly when the universe's rate of expansion peaked according to figure 3.

If you want to believe in an intelligent creator—not that I do—it's as if they were accelerating the expansion until the solar system was formed, then turned the control knob down.

ASalazarMX|3 months ago

That's a very thought provoking observation, as if the whole universe behaved like a wave.

JumpCrisscross|3 months ago

> it kind of fits a sine wave

But wavering around a line above y = 0.

cosmicjoe|3 months ago

As a non-scientist, I've always found the Cosmic Distance Ladder likely to be inaccurate, due to its assumption about the constant brightness of standard-candle stars over their lifetime, and the compounding of error at each rung of the ladder. Direct measurement of the CMB seems simpler, with less chance of error.

https://en.wikipedia.org/wiki/Cosmic_distance_ladder

XorNot|3 months ago

Direct measurement of the CMB can also have problems if our assumptions about it are wrong. A major point of having two methods is that they should converge to the same result within the margin of error; the fact that they didn't told us we were missing something.

naasking|3 months ago

> I've always found the Cosmic Distance Ladder as likely to be inaccurate due its assumption about the constant brightness of Standard Candle stars over their lifetime

Stars are just basic nuclear physics and gravity; that's why they're expected to be stable and predictable.

> Direct measurement of the CMB seems to be simpler with less chance of error.

Direct measurement of the CMB doesn't tell you anything on its own, you have to interpret the data in terms of a model. If you have a completely different model, say one without dark energy or without dark matter, CMB measurements would tell you something different than LCDM.

wtcactus|3 months ago

The Type Ia supernova's luminosity depends on its composition, and that takes into account both the age of the supernova progenitor and of the donor star. And that can be inferred from the luminosity curve of the supernova.

https://en.wikipedia.org/wiki/Type_Ia_supernova

eterevsky|3 months ago

If they are replacing a fixed cosmological constant with a model of variable dark energy, doesn't that introduce extra parameters describing the evolution of dark energy over time? If so, wouldn't it lead to overfitting? Can overfitting alone explain the new model's better match to the data?
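
One standard guard against exactly this worry is an information criterion that penalizes extra parameters. A toy sketch (the chi-square and parameter counts below are invented, not from any real fit):

```python
import math

def bic(chi2, k, n):
    """Bayesian information criterion for a Gaussian likelihood:
    BIC = chi^2 + k * ln(n). Each extra parameter costs ln(n)."""
    return chi2 + k * math.log(n)

n = 1500              # hypothetical number of supernovae in the sample
chi2_lcdm = 1580.0    # toy fit quality for LCDM, 1 free parameter (Omega_m)
chi2_w0wa = 1540.0    # toy fit with evolving dark energy (w0, wa): 3 parameters

# The evolving-dark-energy model only "wins" if its chi^2 improvement
# beats the ln(n) penalty per extra parameter (negative delta = preferred).
delta = bic(chi2_w0wa, 3, n) - bic(chi2_lcdm, 1, n)
```

So extra parameters always fit better in raw chi-square terms; the question reviewers will ask is whether the improvement survives this kind of penalty, and the abstract's claim that the correction independently matches BAO-only predictions is a separate, stronger check than any fit statistic.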

kittikitti|3 months ago

This is a fascinating discovery! It brings into focus the Deep Field imagery from the JWST, where gravitational lensing was found to be greater than expected, along with galaxies that were much older than expected based on redshift calculations. Perhaps this could indicate that the universe is even older than we originally thought, if redshift calculations assumed an incorrect perpetual acceleration.

andrewflnr|3 months ago

> The corrected supernova data and the BAO+CMB-only results both indicate that dark energy weakens and evolves significantly with time.

> More importantly, when the corrected supernova data were combined with BAO and CMB results, the standard ΛCDM model was ruled out with overwhelming significance, the researchers said.

I notice they're not saying that dark energy is entirely unnecessary. Do we know if that's just default caution, or are there still strong reasons to believe dark energy exists?

mr_mitm|3 months ago

The CMB and BAO measurements give us a picture of how the early universe looked. Supernovae are sensitive to the conditions in the late universe. All probes, which are mostly independent, always pointed at the same amount of dark energy.

Now these people are saying SNe actually point at zero dark energy, if the physics is accounted for properly. That doesn't invalidate the CMB and BAO results. So dark energy must have had a big influence in the early universe and no influence in the late universe, so it must be dynamic. (Ironically, supernovae were the first evidence for dark energy, which I guess was just a coincidence, if this new research is correct.)

redwood|3 months ago

Was there a date at the top of this? I didn't see one. I saw similar headlines earlier this year and I'm trying to figure out whether this is something new.

observationist|3 months ago

>>>Submitted by Sam Tonkin on Thu, 06/11/2025

At the very bottom. Weird how style guides keep putting important information like this in harder-to-reach places.

felixfurtak|3 months ago

The linked journal article is dated Nov 6, 2025.

ardit33|3 months ago

Circular universe...? big bang -> expands -> expansion slows -> starts retracting -> singularity again -> big bang again

Roger Penrose seems to be leaning toward, or more convinced by, a cyclic universe theory.

denismenace|3 months ago

Did it change during our life time?

oofbey|3 months ago

Just our understanding of it. That’s flipped multiple times in my lifetime.

candiddevmike|3 months ago

Aside from unanswerable questions (has the universe started to fill its container? Is a simulation property nearing "1"?), does this make long-distance space travel feasible again? I thought the universe was expanding too fast to visit places like Alpha Centauri (and preventing visitors from reaching us).

Sharlin|3 months ago

Edit: A big brain fart, ignore the retracted part below. Colonizing the universe is of course impossible in 100My, barring FTL. What the paper I referred to [1] says is that colonizing the Milky Way may take less than that, and if you can do that, spreading to the rest of the observable universe is fairly easy, very relatively speaking.

<retracted> According to some calculations, it should in principle be possible to colonize the entire observable universe in less than a hundred million years. It's much too fast for the expansion to affect except marginally.</retracted>

The relative jump in difficulty from interstellar to intergalactic is much smaller than from interplanetary to interstellar.

Anyway, as others said, mere intragalactic (and intra-Local Group) travel is not affected by expansion in any way whatsoever.

[1] https://www.sciencedirect.com/science/article/abs/pii/S00945..., PDF at https://www.aleph.se/papers/Spamming%20the%20universe.pdf

oofbey|3 months ago

That limitation only counts for visiting other galaxies. Travel within the galaxy is always possible, regardless of the universe’s expansion. And Alpha Centauri is super close, even within our galaxy.

indoordin0saur|3 months ago

The universe was always only expanding between galaxies, not within them.

jfengel|3 months ago

The limit to space travel is the rocket equation, which says that you require exponentially more fuel to reach higher speeds. Alpha Centauri isn't going anywhere, but it will take millennia of travel even with wildly optimistic assumptions.

Also note that there isn't any "container" to fill up. It could well be infinite. It's just that we will be forever limited to a finite subset, even in theory.
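
The exponential fuel scaling is the Tsiolkovsky rocket equation; a quick sketch with invented exhaust velocities:

```python
import math

def mass_ratio(delta_v_km_s, v_exhaust_km_s):
    """Tsiolkovsky rocket equation: m0/mf = exp(delta_v / v_e)."""
    return math.exp(delta_v_km_s / v_exhaust_km_s)

# Hypothetical numbers: a chemical rocket (v_e ~ 4.5 km/s) trying to reach
# 0.1% of lightspeed (~300 km/s) one-way, ignoring deceleration at the target.
chem = mass_ratio(300, 4.5)      # absurd: ~e^67 units of fuel per unit payload
# An imagined fusion drive with v_e ~ 10,000 km/s makes the same delta-v cheap.
fusion = mass_ratio(300, 10_000)
```

The point of the exponent: doubling your target speed squares the required mass ratio, which is why exhaust velocity, not fuel quantity, is the real bottleneck for interstellar trips.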

WalterBright|3 months ago

I picked the wrong week to put my faith in cosmology!

gmuslera|3 months ago

Someone dumped a flat panel near a noisy planet.

johnwheeler|3 months ago

Just because infinity is a hard thing to understand doesn't mean the universe is and has always been infinite.

karakot|3 months ago

What does 'now' mean here?

plasticchris|3 months ago

Probably it means "now we have evidence that…"; it is a colloquialism.

Edit: yep, The universe's expansion may actually have started to slow rather than accelerating at an ever-increasing rate as previously thought, a new study suggests.

thelibrarian|3 months ago

Going by the second graph, since about 2.5 billion years ago.

2OEH8eoCRo0|3 months ago

What happened to then?

sermah|3 months ago

Recent years, probably because of large data centers /s

mrbluecoat|3 months ago

I have a great deal of respect for the sciences but sometimes astronomy just feels like one giant guessing game: age of the universe, big bang starting as a joke and all the "first minute" timelines thereafter, dark energy and dark matter (code for we have no idea what it is) vastly outnumbering everything else, and now questioning the Nobel Prize-awarded universe expansion. Meanwhile, asteroids the size of buses+ keep whizzing by closer than the moon with little or no warning. Sigh.

ASalazarMX|3 months ago

Consider the scales involved. It's amazing that a species that is 99% chimp genes can even think and deduce phenomena of that size; don't ask it to get it right the first time.

All of that without having traveled farther than one light second from its home.

arp242|3 months ago

> now questioning the Nobel Prize-awarded universe expansion

It is not questioning that the universe is expanding. It is questioning how the expansion is happening. Massive difference. The rate of expansion has always been more of a "probably" and "looks like" rather than "we have very strong evidence" (unlike expansion itself, for which there is very strong evidence). This is a classic "we have tweaked our model as we've learned more" type thing (assuming it holds).

pixl97|3 months ago

I mean, an asteroid the size of a bus is messy for your local area if it decides to land there, but in terms of the size of things in space it's nearly undetectable. Space, even our local neighborhood, is unbelievably huge.

Think of trying to find a bus that could be anywhere on Earth, one that is moving (so it's not easy to keep track of) and painted to be camouflaged with its environment.

Now instead try to imagine looking for that bus on Jupiter. It gets way harder. But it's way bigger than that: you're looking for a black dot in an area of millions of Jupiters, and you just hope it crosses in front of a star so you can track it.

Most problems involving space are insanely hard.

spl757|3 months ago

There has been a lot of progress towards mapping all near-earth asteroids, at least. That's a lot better than the previous tactic of putting one's fingers in one's ears and humming.

CamperBob2|3 months ago

That's a feature! If you want to be certain, you need religion, not science.

And of course, the people concerned with tracking near-earth asteroids are not connected in any way with cosmology.

seydor|3 months ago

We need an index tracking the expansion rate. And an ETF on it.

CommenterPerson|3 months ago

Maybe someone is tailgating it. And it's trying to annoy them by speeding up, then slowing down.

There seem to be so many fudge factors in the whole chain of analysis we won't have an idea until we can make vastly improved measurements.

shomp|3 months ago

Mainstream physics has been delighted to ignore/abandon essential conservation laws when talking about the expanding universe. It's kinda weird, I tried publishing a paper on it recently and it was not received well. In general, if conservation laws are to hold, expansion must be balanced with [eventual] contraction, is that not obvious? Apparently it was quite contentious to say until... this article?

antognini|3 months ago

Noether's theorem tells us when we would expect conservation laws to hold and when we would expect them to fail. In the case of global energy conservation, there would have to be a global time invariance associated with the spacetime. But this is manifestly not the case in an expanding universe. It is generally not even possible to have a well defined notion of global energy in a dynamic spacetime.
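
In symbols (the standard textbook FRW metric, nothing specific to this paper): the metric depends on time through the scale factor, so there is no timelike Killing vector, and Noether's theorem yields no conserved global energy.

```latex
% Flat FRW metric: explicit time dependence enters through a(t)
ds^2 = -c^2\,dt^2 + a(t)^2\left(dx^2 + dy^2 + dz^2\right)
% \partial_t generates a symmetry only if \partial_t g_{\mu\nu} = 0,
% but \partial_t g_{ij} = 2\,a\dot{a}\,\delta_{ij} \neq 0 whenever \dot{a} \neq 0,
% so no conserved energy follows from time translations.
```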

ASalazarMX|3 months ago

> In general, if conservation laws are to hold, expansion must be balanced with [eventual] contraction, is that not obvious?

Why would this be? The only physics we know is the one inside our observable universe, there could be variations beyond, or even unknowable laws that don't require conservation of matter outside the edge of the universe.

Our incredibly vast universe could be a minuscule blob feeding from an incredibly vaster parent universe, in which case it could be breaking conservation infinitely from our perspective.

zygentoma|3 months ago

No, the assumption was that dark energy is a property of space itself, so it does not conserve energy at all in an expanding space.

Also, this discovery is still being explained with dark energy (albeit time-varying), so it still does not assume global energy conservation.

mr_mitm|3 months ago

Maybe this helps: https://www.preposterousuniverse.com/blog/2010/02/22/energy-...

My favorite quote:

> I like to think that, if I were not a professional cosmologist, I would still find it hard to believe that hundreds of cosmologists around the world have latched on to an idea that violates a bedrock principle of physics, simply because they “forgot” it. If the idea of dark energy were in conflict with some other much more fundamental principle, I suspect the theory would be a lot less popular.

frotaur|3 months ago

I mean no disrespect, but are you a trained physicist, or at least familiar with the 'mainstream material'?

Because there is no shortage of 'crackpots' that have 'obvious' solutions to unsolved physics problems, and that want to publish papers about it.