“Supernova (SN) cosmology is based on the key assumption that the luminosity standardization process of Type Ia SNe remains invariant with progenitor age. However, direct and extensive age measurements of SN host galaxies reveal a significant (5.5σ) correlation between standardized SN magnitude and progenitor age, which is expected to introduce a serious systematic bias with redshift in SN cosmology. This systematic bias is largely uncorrected by the commonly used mass-step correction, as progenitor age and host galaxy mass evolve very differently with redshift. After correcting for this age bias as a function of redshift, the SN data set aligns more closely with the cold dark matter (CDM) model” [1].
I know the team that did this. In fact, I was listening to their seminar just a few days ago. They are very careful and have been working on this for a long time. One caveat they readily admit is that the sample used to create the luminosity-age relation has some biases, such as galaxy type and relatively low redshift. They will be updating their results with Rubin LSST data in the next few years.
Exciting times in cosmology after decades of a standard LCDM model.
I did a deep dive into cosmology simulations ~a year ago. It was striking how much is extrapolated from the brightness of small numbers of galaxy-surface pixels. I was looking at this for galaxies and stars, and observed something similar. The cosmology models are doing their best with sparse info, but to me it seemed that the predictions about things like Dark Matter and Dark Energy are presented in a way that's too confident for the underlying data. Not enough effort is spent trying to come up with new models. (Not to mention trying to shut down alternatives to Lambda CDM, or a better understanding of the consequences of GR, and the assumptions behind applying Newtonian instant-effect gravity in simulations).
Whenever I read things like "This model can't explain the bullet cluster, or X rotation curve, so it's probably wrong" my internal response is "Your underlying data sources are too fuzzy to make your model the baseline!"
I think the most established models are doing their best with the data they have, but there is so much room for new areas of exploration based on questioning assumptions about the feeble measurements we can make from this pale blue dot.
> type Ia supernovae, long regarded as the universe’s "standard candles", are in fact strongly affected by the age of their progenitor stars.
A key point in the article. From what I understand, this is the main way we measure things of vast distance and, from that, determine the universe's rate of expansion. If our understanding of these supernovae is wrong, as this paper claims, that would be a massive scientific breakthrough.
I'm really interested in the counterargument to this.
It could be a big discovery and it also aligns with the findings from DESI BAO [1] and by another Korean group using galaxy clustering to infer the expansion history [2].
I'm dumb and barely understand things at a high level, but standard candles never sat right with me so it's interesting to hear that they might not be, but then again who knows.
This is mostly my physics ignorance talking, but if we measure distance in space-time and not just space, and speed or velocity is space-time/time (which somehow are both relative to each other), and the derivative of velocity is acceleration, can't acceleration mean either expanding "faster" in the sense of distance OR time speeding up or slowing down? All of it seems so self-referential it's hard to wrap your head around.
Seems like the problem should be pretty easy to figure out. Just need to wait ~5 gigayears and see which model is right. I'm personally hoping for deceleration so that we have more total visitable volume.
I'll set a reminder to check back at that time to see who was right.
With 5 gigayears to work with I'm going to move a few star systems over, break down all the matter orbiting the star into a Dyson sphere made of computronium, and simulate visiting any world I could possibly ever want to.
Anyone know how credible this is? If true, then that means the big bounce is back on the menu, and the universe could actually be an infinitely oscillating system.
At least The Guardian has a comment from an independent expert:
"Prof Carlos Frenk, a cosmologist at the University of Durham, who was not involved in the latest work, said the findings were worthy of attention. “It’s definitely interesting. It’s very provocative. It may well be wrong,” he said. “It’s not something that you can dismiss. They’ve put out a paper with tantalising results with very profound conclusions.”"
> If true, then that means the big bounce is back on the menu
I don't think so. Deceleration does not imply recollapse. AFAIK none of this changes the basic fact that there isn't enough matter in the universe to cause it to recollapse. The expansion will just decelerate forever, never quite stopping.
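The "enough matter to recollapse" threshold is the critical density. A quick back-of-the-envelope sketch, my own illustration rather than anything from the thread, with an assumed H0 of 70 km/s/Mpc:

```python
import math

G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
MPC_IN_M = 3.086e22   # one megaparsec in metres

def critical_density(h0_km_s_mpc: float) -> float:
    """Critical density rho_c = 3 H0^2 / (8 pi G), in kg/m^3."""
    h0 = h0_km_s_mpc * 1e3 / MPC_IN_M   # Hubble constant in s^-1
    return 3 * h0 ** 2 / (8 * math.pi * G)

rho_c = critical_density(70.0)   # ~9e-27 kg/m^3: a few hydrogen atoms per m^3
```

Absent dark energy, whether expansion eventually reverses comes down to whether the actual mean density exceeds this value; observations put the matter density well below it.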
AFAIK the previous models that all assumed Type Ia supernovae were not affected by the age of the progenitor stars had no actual analysis to back that up; it was just the simplest assumption. This research is now actually doing the analysis.
“We can’t observe the whole universe, so cosmology is not really about the universe. It’s about the observable patch and the assumptions we make about the rest.”
(paraphrasing George Ellis)
We’re in a bounding sphere, with a radius that’s roughly 46.5 billion lightyears, so any observation we make may be true for our local observable range, but there’s no (known) way to know what’s beyond that sphere.
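That 46.5 billion-light-year figure is the comoving distance to the edge of the observable patch, and it can be roughly reproduced from the standard flat-ΛCDM expansion history. A sketch with assumed parameters (Ωm = 0.3, ΩΛ = 0.7, H0 = 70 km/s/Mpc, radiation neglected), not taken from the thread:

```python
import math

C_KM_S = 299_792.458
OMEGA_M, OMEGA_L = 0.3, 0.7   # assumed flat LCDM parameters (radiation neglected)

def comoving_horizon_gly(h0_km_s_mpc: float = 70.0, steps: int = 100_000) -> float:
    """Comoving distance to z -> infinity, in billions of light years.
    Substituting a = x^2 keeps the integrand finite at the Big Bang end."""
    dx = 1.0 / steps
    integral = 0.0
    for i in range(steps):
        x = (i + 0.5) * dx                              # midpoint rule
        integral += 2.0 / math.sqrt(OMEGA_M + OMEGA_L * x ** 6) * dx
    hubble_distance_mpc = C_KM_S / h0_km_s_mpc          # c / H0 in Mpc
    return integral * hubble_distance_mpc * 3.2616e-3   # Mpc -> Gly

radius = comoving_horizon_gly()   # ~46, close to the quoted 46.5 Gly
```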
The more we learn, the less we end up knowing about how "everything" works - some things are mathematical in nature and demonstrate absolutes, but frameworks shift, and complexify, and exceptions to things we thought absolutes have occurred throughout history.
For claims about how the universe works at scales and timeframes so utterly beyond anything testable, it's a little difficult to say this is credible at all - not dunking on the researchers, but in order to validate their conclusions, there's a whole chain of dependencies and assumptions you'd have to follow along with, and each of those things will be its own complex birds nest tangle of assertions, and I don't see how you can really say one way or another until you have a lot more information and a lot better Theory of Everything than we've got right now.
For what it's worth, for all the impact it'll have on anyone's life outside of academia, I'd say they're 100% correct and people should buy them free beers at their local pubs for at least the next year in return for explaining their ideas at length.
Standard candles are the gift that keeps on giving: all these measurements of redshift as a function of distance depend on us actually getting the distance of what we are measuring right.
This study (and many others, depending on the cosmic scales they use) mainly uses supernovae of Type Ia: the explosion of a white dwarf in a binary system that is accreting mass from a very nearby companion star, gaining mass until it approaches the Chandrasekhar limit, at which point runaway carbon fusion ignites and the star is blown apart in a supernova of characteristic energy.
That was (and still is now, with some corrections we have found since the middle of last century) supposed to be the same everywhere. Problem is, we keep finding new corrections to it - like this study claims.
That is in fact the big claim of this study (ignore the universe-expansion part): that they found a new correction to the Type Ia supernova luminosity. It's a very big claim and extremely interesting if confirmed. But, like all big claims, it needs big confirmation. I'm a bit skeptical TBH.
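For context on why a luminosity correction matters so much: standard-candle distances come from the distance modulus, so any shift in the assumed absolute magnitude translates directly into a distance error. A minimal sketch (the -19.3 peak magnitude is a typical textbook value for standardized Type Ia SNe, not a number from this study):

```python
def luminosity_distance_mpc(apparent_mag: float, absolute_mag: float = -19.3) -> float:
    """Distance implied by the distance modulus m - M = 5*log10(d / 10 pc).
    -19.3 is a typical standardized Type Ia peak magnitude (textbook value)."""
    d_parsec = 10 ** ((apparent_mag - absolute_mag + 5) / 5)
    return d_parsec / 1e6  # parsecs -> megaparsecs

d = luminosity_distance_mpc(20.0)   # ~720 Mpc for an observed m = 20
# the crux of the age-bias claim: a 0.1 mag shift in the assumed absolute
# magnitude moves every inferred distance by ~5%, since 10**(0.1/5) ~ 1.047
biased = luminosity_distance_mpc(20.0, absolute_mag=-19.2)
```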
Out of curiosity, what data are you drawing on, or what qualifications do you have, that support your skepticism over three different modes of analysis (as well as pretty much every recent development in the field) supporting this claim:
"Remarkably, this agrees with what is independently predicted from BAO-only or BAO+CMB analyses, though this fact has received little attention so far."
I would not be surprised if the universe were somewhat elastic, expanding and then contracting and then expanding ad infinitum.
After all, existence in itself is irrefutable and cannot not exist by definition.
If we subscribe to a theory of the multiverse, set theory, likelihood, and interaction-driven evolution based on gradient-type fundamental laws: locally changing, with everything obviously sharing a fundamental quality that is part of existence itself. But obviously there are sets, there is differentiation. But it is not created; the infinity of unconstrained possibilities exists in the first place and reorganizes itself, a bit like people who are attracted to people who share some commonalities, or have something they need from each other, and form tribes. The same process kind of works for synapse connections, for molecule formations, for atoms... etc...
Everything is mostly interacting data.
We could say that the concept of distance is a concept of likelihood. The closer is also the most likely.
Just a little weird idea. I need to think a bit more about it. Somewhat metaphysical?
Eventually we will find that the heat death of the universe and the big bang are the same thing, since the totality of the universe is always a oneness, then from the universal perspective the infinitely small and infinitely large are the same thing (one), then they by nature bleed into (and define) each other like yin and yang.
A funny coincidence is that the solar system was formed 4.6 billion years ago which is exactly when the universe's rate of expansion peaked according to figure 3.
If you want to believe in an intelligent creator—not that I do—it's as if they were accelerating the expansion until the solar system was formed, then turned the control knob down.
As a non-scientist I've always found the Cosmic Distance Ladder likely to be inaccurate, due to its assumption about the constant brightness of standard-candle stars over their lifetime and the compounding of error at each rung of the ladder. Direct measurement of the CMB seems simpler, with less chance of error.
Direct measurement of the CMB can also have problems if our assumptions about it are wrong. A major benefit of having two methods is that they should converge to the same result within the margin of error; that they didn't is what told us we were missing something.
> I've always found the Cosmic Distance Ladder as likely to be inaccurate due its assumption about the constant brightness of Standard Candle stars over their lifetime
Stars are just basic nuclear physics and gravity, that's why they're expected to be stable and predictable.
> Direct measurement of the CMB seems to be simpler with less chance of error.
Direct measurement of the CMB doesn't tell you anything on its own, you have to interpret the data in terms of a model. If you have a completely different model, say one without dark energy or without dark matter, CMB measurements would tell you something different than LCDM.
The Type Ia supernova's luminosity depends on its composition, and that takes into account both the age of the white dwarf and of the donor star. And that can be inferred from the light curve of the supernova.
If they are replacing a fixed cosmological constant by a model with variable dark energy, doesn't it introduce extra parameters that describe the evolution of dark energy over time? If so, wouldn't it lead to overfitting? Can overfitting alone explain better match of the new model to the data?
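Extra parameters do improve the raw fit by construction; the standard guard against this is an information criterion that charges each extra parameter. A toy sketch with made-up chi-square numbers (w0 and wa are the usual CPL dark-energy parameters; none of these values are from the paper):

```python
def aic(chi2: float, k: int) -> float:
    """Akaike information criterion: chi^2 + 2k, lower is better.
    Each extra free parameter has to "buy" at least ~2 units of chi^2."""
    return chi2 + 2 * k

# purely illustrative numbers, NOT from the paper:
# LCDM with 6 base parameters vs. a w0-wa (CPL) dark-energy model
# that frees 2 extra parameters and improves the fit by 15
aic_lcdm = aic(chi2=1420.0, k=6)   # 1432.0
aic_w0wa = aic(chi2=1405.0, k=8)   # 1421.0
assert aic_w0wa < aic_lcdm  # the extra parameters earned more than their penalty
```

So "overfitting alone" is checkable: the evolving model is only preferred if its chi-square improvement exceeds the penalty for its added freedom.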
This is a fascinating discovery! It brings into focus the Deep Field imagery from the JWST, where gravitational lensing was found to be greater than expected, along with galaxies that were much older than expected based on redshift calculations. Perhaps this could indicate that the universe is even older than we originally thought, if redshift calculations assumed an incorrect perpetual acceleration.
> The corrected supernova data and the BAO+CMB-only results both indicate that dark energy weakens and evolves significantly with time.
> More importantly, when the corrected supernova data were combined with BAO and CMB results, the standard ΛCDM model was ruled out with overwhelming significance, the researchers said.
I notice they're not saying that dark energy is entirely unnecessary. Do we know if that's just default caution, or are there still strong reasons to believe dark energy exists?
The CMB and BAO measurements give us a picture of how the early universe looked. Supernovae are sensitive to the conditions in the late universe. All probes, which are mostly independent, always pointed at the same amount of dark energy.
Now these people are saying SNe actually point at zero dark energy, if you account for the physics properly. That doesn't invalidate the CMB and BAO results. So dark energy must have had a big influence in the early universe and no influence in the late universe, so it must be dynamic. (Ironically, supernovae were the first evidence for dark energy, which I guess was just a coincidence, if this new research is correct.)
Was there a date at the top of this? I didn't see one. I saw similar headlines earlier this year and I'm trying to work out whether this is something new.
You're probably thinking of the DESI BAO results from March, which also cast doubt on the standard cosmological model. These new results point further in the same direction as the DESI ones.
Aside from unanswerable questions (has the universe started to fill its container? Is a simulation property nearing "1"?), does this make long-distance space travel feasible again? I thought the universe was expanding too fast to visit places like Alpha Centauri (and preventing visitors to us).
Edit: A big brain fart, ignore the retracted part below. Colonizing the universe is of course impossible in 100My, barring FTL. What the paper I referred to [1] says is that colonizing the Milky Way may take less than that, and if you can do that, spreading to the rest of the observable universe is fairly easy, very relatively speaking.
<retracted> According to some calculations, it should in principle be possible to colonize the entire observable universe in less than a hundred million years. It's much too fast for the expansion to affect except marginally.</retracted>
The relative jump in difficulty from interstellar to intergalactic is much smaller than from interplanetary to interstellar.
Anyway, as others said, mere intragalactic (and intra-Local Group) travel is not affected by expansion in any way whatsoever.
That limitation only counts for visiting other galaxies. Travel within the galaxy is always possible, regardless of the universe’s expansion. And Alpha Centauri is super close, even within our galaxy.
The limit to space travel is the Rocket Equation, which says that you require exponential fuel to reach higher speeds. Alpha Centauri isn't going anywhere, but it will take millennia of travel even with wildly optimistic assumptions.
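A rough sketch of where the millennia come from, using the Tsiolkovsky equation with chemical-rocket numbers (illustrative assumptions of mine, not figures from the thread):

```python
import math

C = 299_792_458.0        # speed of light, m/s
ALPHA_CEN_LY = 4.37      # distance to Alpha Centauri, light years

def delta_v(v_exhaust: float, mass_ratio: float) -> float:
    """Tsiolkovsky rocket equation: dv = v_e * ln(m0 / m1)."""
    return v_exhaust * math.log(mass_ratio)

# chemical propulsion: exhaust velocity ~4.5 km/s; even with 90% of the
# launch mass as propellant (mass ratio 10) the coast speed is tiny vs. c
dv = delta_v(4500.0, 10.0)               # ~10.4 km/s
years_one_way = ALPHA_CEN_LY * C / dv    # ~1e5 years, with no braking at arrival
```

The logarithm is the killer: going ten times faster needs e^10 ≈ 22,000 times the mass ratio, which is why proposals jump straight to exotic propulsion.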
Also note that there isn't any "container" to fill up. It could well be infinite. It's just that we will be forever limited to a finite subset, even in theory.
Probably it means that now we have evidence that… it is a colloquialism
Edit: yep: "The universe's expansion may actually have started to slow rather than accelerating at an ever-increasing rate as previously thought, a new study suggests."
I have a great deal of respect for the sciences but sometimes astronomy just feels like one giant guessing game: age of the universe, big bang starting as a joke and all the "first minute" timelines thereafter, dark energy and dark matter (code for we have no idea what it is) vastly outnumbering everything else, and now questioning the Nobel Prize-awarded universe expansion. Meanwhile, asteroids the size of buses+ keep whizzing by closer than the moon with little or no warning. Sigh.
Consider the scales involved. It's amazing that a species that is 99% chimp genes can even think and deduce phenomena of that size; don't ask it to get it right the first time.
All of that without having traveled farther than one light second from its home.
> now questioning the Nobel Prize-awarded universe expansion
It is not questioning that the universe is expanding. It is questioning how the expansion is happening. Massive difference. The rate of expansion has always been more of a "probably" and "looks like" rather than "we have very strong evidence" (unlike expansion itself, for which there is very strong evidence). This is a classic "we have tweaked our model as we've learned more" type thing (assuming it holds).
I mean, an asteroid the size of a bus is messy for your local area if it decides to land there, but in terms of the sizes of things in space it's nearly undetectable. Space, even our local neighborhood, is unbelievably huge.
Think of trying to find a bus that could be anywhere on Earth, that is moving, so it's not easy to keep track of, and that is painted to be camouflaged with its environment.
Now instead imagine looking for that bus on Jupiter. It gets way harder. But it's way bigger than that: you're looking for a black dot somewhere in an area of millions of Jupiters, just hoping it crosses in front of a star so you can track it.
Most problems involving space are insanely hard.
There has been a lot of progress towards mapping all near-earth asteroids, at least. That's a lot better than the previous tactic of putting one's fingers in one's ears and humming.
Mainstream physics has been delighted to ignore/abandon essential conservation laws when talking about the expanding universe. It's kinda weird, I tried publishing a paper on it recently and it was not received well. In general, if conservation laws are to hold, expansion must be balanced with [eventual] contraction, is that not obvious? Apparently it was quite contentious to say until... this article?
Noether's theorem tells us when we would expect conservation laws to hold and when we would expect them to fail. In the case of global energy conservation, there would have to be a global time invariance associated with the spacetime. But this is manifestly not the case in an expanding universe. It is generally not even possible to have a well defined notion of global energy in a dynamic spacetime.
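Concretely: in an FRW spacetime the energy densities obey the continuity equation (standard textbook cosmology, added here for illustration):

```latex
% FRW continuity equation: \rho is energy density, p is pressure, a(t) the scale factor
\dot{\rho} + 3\,\frac{\dot{a}}{a}\,\left(\rho + p\right) = 0
```

For radiation, p = ρ/3 gives ρ ∝ a⁻⁴: photon energy redshifts away with no place it "goes". For a cosmological constant, p = -ρ gives ρ̇ = 0: the density stays fixed while comoving volumes grow, so the total "energy" in them increases. Neither case balances a global ledger, and per Noether's theorem none is owed, because the expanding background has no time-translation symmetry.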
> In general, if conservation laws are to hold, expansion must be balanced with [eventual] contraction, is that not obvious?
Why would this be? The only physics we know is the one inside our observable universe, there could be variations beyond, or even unknowable laws that don't require conservation of matter outside the edge of the universe.
Our incredibly vast universe could be a minuscule blob feeding from an incredibly vaster parent universe, in which case it could be breaking conservation infinitely from our perspective.
My favorite quote:
> I like to think that, if I were not a professional cosmologist, I would still find it hard to believe that hundreds of cosmologists around the world have latched on to an idea that violates a bedrock principle of physics, simply because they “forgot” it. If the idea of dark energy were in conflict with some other much more fundamental principle, I suspect the theory would be a lot less popular.
JumpCrisscross|3 months ago
[1] https://academic.oup.com/mnras/article/544/1/975/8281988?log...
negativelambda|3 months ago
[1] https://arxiv.org/abs/2404.03002
[2] https://arxiv.org/abs/2305.00206
stronglikedan|3 months ago
Indeed. It's so hard to definitively prove things that are, that the most significant breakthroughs prove things that aren't (so to speak), imho.
https://www.theguardian.com/science/2025/nov/06/universe-exp...
antonvs|3 months ago
I can say the same about forgnoz, which is something I've just invented that must exist by definition.
You'd need to try a bit harder to make existence actually inevitable.
sfink|3 months ago
Universe gong.
JumpCrisscross|3 months ago
But wavering around a line above y = 0.
cosmicjoe|3 months ago
https://en.wikipedia.org/wiki/Cosmic_distance_ladder
wtcactus|3 months ago
https://en.wikipedia.org/wiki/Type_Ia_supernova
observationist|3 months ago
At the very bottom. Weird how style guides keep putting important information like this in harder to reach places.
griffzhowl|3 months ago
https://newscenter.lbl.gov/2025/03/19/new-desi-results-stren...
mr_mitm|3 months ago
https://xcancel.com/dscol/status/1987118124496249230
ardit33|3 months ago
Roger Penrose seems to be leaning toward, or more convinced of, the cyclic universe theory (his conformal cyclic cosmology)...
[1] https://www.sciencedirect.com/science/article/abs/pii/S00945..., PDF at https://www.aleph.se/papers/Spamming%20the%20universe.pdf
CamperBob2|3 months ago
And of course, the people concerned with tracking near-earth asteroids are not connected in any way with cosmology.
CommenterPerson|3 months ago
There seem to be so many fudge factors in the whole chain of analysis we won't have an idea until we can make vastly improved measurements.
zygentoma|3 months ago
Also, this discovery is still being explained with dark energy (albeit time-varying …), so it still does not assume global energy conservation.
frotaur|3 months ago
Because there is no shortage of 'crackpots' who have 'obvious' solutions to unsolved physics problems, and who want to publish papers about them.