Clocks run at "normal" speed (i.e. "1x" speed) in the absence of a gravitational field. The stronger the gravity, the slower they run (i.e. less than "1x" speed).
This has always felt to me like evidence of a sort of computationalism. I am not a computationalist, but the thought is that the "universal CPU" needs cycles for each particle. Mass is what takes time to process, so the voids experience little or no computational delay. This reads like the simulation author is messy and constrained, not godlike.
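The clock-rate claim above can be put in numbers with the standard Schwarzschild factor dtau/dt = sqrt(1 - 2GM/(r c^2)). A minimal Python sketch (the masses and radii are textbook illustrative values, not from the paper):

```python
import math

G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8    # speed of light, m/s

def clock_rate(mass_kg: float, r_m: float) -> float:
    """Ticking rate of a clock at radius r from a spherical,
    non-rotating mass M, relative to a distant "1x" clock:
    dtau/dt = sqrt(1 - 2GM/(r c^2))  (Schwarzschild)."""
    return math.sqrt(1.0 - 2.0 * G * mass_kg / (r_m * C**2))

earth = clock_rate(5.972e24, 6.371e6)   # Earth's surface
ns = clock_rate(2.8e30, 1.2e4)          # ~1.4 solar masses at 12 km
print(earth)  # a hair under 1: slower by roughly 7 parts in 10^10
print(ns)     # a neutron-star surface clock runs at ~0.8x
```

The Earth-surface effect is tiny but real (it is what GPS has to correct for); near a neutron star the "less than 1x" slowdown becomes a sizeable fraction.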
To me it's not about mass, but more like "maximum information density". There's a limit on the information density (rate of happening?), so when a parameter X changes too much, it affects other parameters -- they become constrained so that the total information density stays within the maximum limit. That would indeed sound like some kind of computational limit if the universe were a massive CPU with constrained resources...
But I'm a layperson and I have no idea what I'm talking about :)
Right, so is the paper saying that Lambda-CDM completely ignored clock differences due to heterogeneity in the universe's mass distribution, where isolated galaxies would experience less time slowing than galaxies near other galaxies, which would experience more time dilation?
In the standard cosmology the Integrated Sachs-Wolfe effect captures the redshift/blueshift of distant light sources (up to the Cosmic Microwave Background) as it traverses relatively dense regions and relative voids.
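For reference, the late-time ISW temperature shift is the integral of the time derivative of the gravitational potential $\Phi$ along the photon's path (units with $c = 1$):

$$\left(\frac{\Delta T}{T}\right)_{\mathrm{ISW}} = 2 \int_{t_{\mathrm{dec}}}^{t_{0}} \frac{\partial \Phi}{\partial t}\, dt$$

In a purely matter-dominated universe the potentials are constant and the integral vanishes; it is the decay of the potentials (e.g. as voids grow under accelerated expansion) that produces the late-time signal.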
Note that in the next paragraph I depart significantly from the vocabulary that the Timescapes programme proponents have been using for the past twenty years.
ISW and comparable spectroscopy is easy enough to think about in terms of an accelerating cosmic expansion, i.e., relative voids are becoming spatially bigger with the expansion. It becomes much less intuitive how to fit the data if one instead keeps relative voids at roughly constant volume, implying that there is a significant false vacuum above the ground state, and that in voids the false vacuum is slowly decaying to that state. (Outside the supervoids, near matter, this false vacuum decays much more slowly still.) Because "vacuum" in the voids isn't really vacuum, one is stuck either with a running function on the constant c (it gets faster with time from the formation of the CMB, because the false vacuum evolves towards a real vacuum) or with adapting lightlike geodesics by imposing refraction (since the false vacuum is a medium).
The usual terminology is reasonably captured in the first paragraph at <https://en.wikipedia.org/wiki/Inhomogeneous_cosmology#Inhomo...> ("Inhomogeneous universe"). The following short section ("Perturbative approach") is what is done in the standard cosmology when one wants to do detailed studies of filamentary distributions and other structures that are lumpy at some (larrrrrge) length scale of interest: the perturbed homogeneous background is practically always the standard FLRW.
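Concretely, the "Perturbative approach" means writing the metric as an FLRW background plus small potentials; in the conformal Newtonian gauge the standard form is

$$ds^{2} = a^{2}(\eta)\left[-(1 + 2\Phi)\,d\eta^{2} + (1 - 2\Psi)\,\delta_{ij}\,dx^{i}\,dx^{j}\right]$$

with $\Phi \approx \Psi$ in the absence of anisotropic stress. The expansion is trusted because the potentials stay tiny (typically $|\Phi| \sim 10^{-5}$) even for quite large structures.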
The justification for perturbation theory on FLRW is that even though there are dense spots (notably most galaxies' central black holes), principles like the Birkhoff theorem capture the idea that as you get far enough away from a galaxy it behaves more and more like a small shell, and this happens at intragalactic scales for these SMBHs: gravitationally, even to its arms' structure, it makes practically no difference whether Andromeda's central bulge has a lot more stars/gas/dust or whether it has one, two, or six central SMBHs (at enough spatial separation that they're not mutually orbiting in a way that would generate gravitational radiation our observatories are sensitive to).
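Not the Birkhoff theorem itself, but the Newtonian far-field version of the same intuition is easy to check numerically: one central mass versus the same total mass split into several separated clumps produces nearly the same pull once you are far enough away. All masses, offsets, and distances below are made up purely for illustration:

```python
import math

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.086e19   # one kiloparsec in metres

def accel_x(masses, positions, x_obs):
    """Newtonian acceleration (x-component) at (x_obs, 0, 0)
    from point masses at the given (x, y, z) positions."""
    ax = 0.0
    for m, (px, py, pz) in zip(masses, positions):
        dx, dy, dz = px - x_obs, py, pz
        r = math.sqrt(dx * dx + dy * dy + dz * dz)
        ax += G * m * dx / r**3
    return ax

M = 1.3e41  # ~6.5e10 solar masses; an illustrative bulge-scale mass

# one central mass vs. the same mass split into six clumps
# spread over ~1 kpc around the centre
one_m, one_p = [M], [(0.0, 0.0, 0.0)]
six_m = [M / 6] * 6
six_p = [(0.3 * KPC, 0, 0), (-0.3 * KPC, 0, 0),
         (0, 0.4 * KPC, 0), (0, -0.4 * KPC, 0),
         (0, 0, 0.5 * KPC), (0, 0, -0.5 * KPC)]

rels = {}
for d_kpc in (2, 10, 50):
    x = d_kpc * KPC
    a1 = accel_x(one_m, one_p, x)
    a6 = accel_x(six_m, six_p, x)
    rels[d_kpc] = abs(a6 - a1) / abs(a1)
    print(f"{d_kpc:>3} kpc: relative difference {rels[d_kpc]:.2e}")
```

The residual falls off roughly as the square of (clump separation / distance), which is the quadrupole term dying away; past a few tens of kpc the six-clump arrangement is gravitationally indistinguishable from a single point.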
The same idea applies to galaxies->galaxy clusters->filamentary structures: as you "zoom out" the density variations become less important: filaments are pretty sparse on average.
The Timescapes programme wants a sharper difference in matter sparseness between voids and filaments, and proposes that gravitational backreaction by the matter is responsible for generating that: the presence of matter steepens the matter density contrast over time (without the visible matter clearly becoming denser). I don't personally see how that's much different from a false-vacuum decay in the voids, conceptually. (ETA: well, it depends somewhat on how the Timescape void fraction evolves, but the local universe VF doesn't run void clocks fast enough, unless we do violence to the Copernican principle.)
Finally, I think the most important result of this latest Timescapes paper is a reminder to everyone that supernova data are a mess. A good X-mas present would be a couple of readily visible Milky Way supernovae.
jchanimal|1 year ago
kgeist|1 year ago
jryan49|1 year ago
vlovich123|1 year ago
raattgift|1 year ago
https://en.wikipedia.org/wiki/Sachs%E2%80%93Wolfe_effect
(Also ETA, mostly a note-to-self: I also don't understand how they capture the angular diameter turnover point in their dressed geometry <https://journals.aps.org/prd/abstract/10.1103/PhysRevD.80.12...> PDF available from institution at <https://ir.canterbury.ac.nz/items/36fe829a-0e7a-45d6-8db6-c2...> (cf <https://astronomy.stackexchange.com/questions/21006/understa...>.))