* spacetime itself literally “creates itself out of nothingness”. We say it “stretches”, but don't picture a string of cheese “thinning out”; picture something of constant density being continuously expanded.
* local groups (of galaxies) are gravitationally bound, i.e. they won’t scatter; their mutual gravity keeps them together even though more spacetime is created inside of them.
* in the empty “space” between local groups, something (dark matter? dark energy?) causes this steady, continuous creation of new spacetime
* this creation of spacetime is itself accelerating, but this is not speed, so distant local groups are not “moving away from us” but are being “displaced away from us”. Hence the displacement rate is allowed to be faster than the speed of light (since it’s not a speed)
* every single object in the universe moves at exactly and precisely the speed of light c through spacetime. An object at rest (a completely stationary astronaut stranded in the empty space between local groups) is also moving at the speed of light through spacetime: he is moving only through time.
* A photon is moving only through space and has no time
Sources: The Elegant Universe, Brian Greene, ScienceClic YouTube, and my memory of these
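The last two bullets are Greene's four-velocity picture: for a massive particle, the Minkowski norm of the four-velocity comes out to exactly c no matter how fast it moves through space. A minimal numerical check (plain Python; my own illustration, not from the sources above):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def four_velocity(v):
    """(c*dt/dtau, dx/dtau) for a particle with 3-velocity v along x."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return gamma * C, gamma * v

# The "speed through spacetime" is the Minkowski norm sqrt(ut^2 - ux^2);
# it equals c for every v < c, including v = 0 (pure motion through time).
ut, ux = four_velocity(0.6 * C)
speed_through_spacetime = math.sqrt(ut**2 - ux**2)  # == C up to rounding
```

At v = 0 the whole norm sits in the time component, which is the "astronaut moving only through time" bullet.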
> local groups (of galaxies) [have] more spacetime [...] created inside of them
No. For all practical purposes (FAPP), galaxy clusters are not expanding. Swiss cheese cosmologies where Lemaître-Tolman-Bondi (LTB) solution vacuoles are embedded within the usual Friedmann-Lemaître-Robertson-Walker (FLRW) cosmological solution generate observables that closely match the luminosity-redshift and angular diameter-redshift relations for active galactic nuclei and Type Ia supernovae extremely well.
LTB is a contracting spacetime -- there simply is no expansion within the vacuole, mathematically. Adding in expansion gives you something other than LTB, which leads to a different redshift for AGNs and SNs in the same cluster when correcting for peculiar motion within the cluster. The evidence favours LTB, and the physical interpretation is that the galaxy clusters and the gas surrounding them are condensing into a point over cosmologically long periods, while the galaxy clusters separate according to the surrounding expanding FLRW solution (which captures the Lyman-alpha forest, and the CMB redshift, among other observables).
Moreover, at the scale of the solar system, there is simply no measurable metric expansion. FAPP the solar system matches an LTB solution.
Since LTB solutions are asymptotically flat, we can nest them hierarchically, using an Israel-Darmois thin shell junction to capture how radiation and other matter moves between the nested LTB solutions and between the "outer" LTB solution for the vacuole in general and the FLRW solution. This works reasonably well, and there have been good numerical approaches since the mid-1990s (Musgrave, 1996).
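For reference, the LTB line element in comoving-synchronous coordinates (standard form, units G = c = 1; my transcription, so check it against a textbook) is

```latex
ds^2 = -dt^2 + \frac{R'(t,r)^2}{1 + 2E(r)}\, dr^2 + R(t,r)^2\, d\Omega^2 ,
```

where R' = ∂R/∂r and E(r) is the local energy function; the homogeneous FLRW case is recovered by setting R(t,r) = a(t)r with 2E(r) = -kr².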
> every single object in the universe moves exactly and precisely at the speed of light through spacetime
No. In General Relativity velocity vectors are ambiguous except at a coincident point. At a point occupied by a massless particle or wave (a photon or classical light, for instance) and one occupied by a massive particle, one can clearly and unambiguously demonstrate that the former is faster.
In a Lorentzian manifold, like our universe, there is a clear distinction between lightlike and timelike geodesics; light couples to one but not the other, and anything massive couples only to timelike (and not lightlike) geodesics. This is an inevitable consequence of the 1+3 dimensional geometry: https://en.wikipedia.org/wiki/Causal_structure#Tangent_vecto...
Light (and other massless waves or particles, those with "null" mass) simply do not have access to the same tangent vectors as massive objects, and vice-versa. This is extremely well supported by experiment and observation: https://en.wikipedia.org/wiki/Modern_searches_for_Lorentz_vi...
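One way to see "different tangent vectors" concretely: the causal character of a vector is just the sign of the metric's quadratic form on it. A toy classifier for the flat (special-relativistic) case, units c = 1 -- my own sketch, not taken from the linked pages:

```python
def causal_character(t, x, y, z, eps=1e-12):
    """Classify a tangent vector under the Minkowski metric, signature (-,+,+,+), c = 1."""
    q = -t * t + x * x + y * y + z * z
    if q < -eps:
        return "timelike"   # the only velocity vectors available to massive objects
    if q > eps:
        return "spacelike"  # not a velocity for anything physical
    return "null"           # the only velocity vectors available to light

# causal_character(1, 0, 0, 0) -> "timelike" (an observer at rest)
# causal_character(1, 1, 0, 0) -> "null"     (a light ray along x)
```

Massive objects only ever get the first kind, light only the last; there is no parameter choice that converts one into the other.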
> a photon is moving only through space and has no time
A classical light wave is either at a point in spacetime or it is not. Quantum mechanical photons do not much differ.
"Movement through spacetime" arises because you have chosen to split up the whole spacetime into spaces ordered by time, turning a worldline/worldtube into a set of time-ordered points. But your choice of splitting is not the right splitting any more than anyone else's. I can choose to split the universe into spaces oriented along the path of a photon emitted from a distant galaxy to my CCD detector. Or from the lightbulb across the room to my eyeball. It was obviously still in the distant galaxy, or in the filament, in my idea of the past, and a bit less past it was in flight between the two (if you believe in physical realism). That is, applying our favoured splitting to a system does not change the underlying worldtube, only how we label any given part of it.
One can carve up a lightlike geodesic however one likes, for example by assigning any sort of ordered labels between the emission and the absorption. Done carefully, one can use the affine parameter, which has some useful properties for calculating things like redshift (details: https://physics.stackexchange.com/questions/17509/what-is-th... ).
What one cannot do is just use the same proper time as we can on timelike geodesics. That's because proper time is defined for timelike geodesics, and those cannot be occupied by light.
Moreover, we return to the point above about ambiguity. By fixing a coordinate system -- in this case the cosmological frame, a stack of 3d spaces ordered by the cosmological scale factor -- we can certainly talk about "a" global clock being used to compare the speed of light waves/photons versus neutrinos or protons. Other clocks are available; there's nothing special about the cosmological frame other than being calculationally useful and recoverable in principle by all physical observers that interact with electromagnetism. The fact that we are not compelled to use the cosmological frame is a manifestation of the ambiguity. That's why we are interested in coincident points: where a particle interacts with a detector, all observers can compare the particle momentum with the detector momentum, and transform the comparisons from observer to observer.
So, "a classical light wave traces out a lightlike path between creation and annihilation" is correct, and introducing "photon" into the mix doesn't change things much (there are various definitions of photon, incidentally); "we can carve up a lightlike path using a parameter conceptually comparable to proper time"; "we cannot use proper time as labels on a null geodesic as such, but we can certainly consider the limit as we take a low-mass ultra-relativistic particle's mass to zero as a decent proxy".
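That zero-mass limit is easy to watch numerically: a moving clock's proper-time rate is dτ/dt = √(1 − v²/c²), which tends to 0 as v → c -- the only defensible reading of "a photon has no time". A sketch in units c = 1 (my own illustration):

```python
import math

def proper_time_rate(v):
    """dtau/dt for a clock moving at speed v, in units where c = 1."""
    return math.sqrt(1.0 - v * v)

# proper_time_rate(0.0)      -> 1.0     (a clock at rest keeps coordinate time)
# proper_time_rate(0.999999) -> ~0.0014 (an ultra-relativistic clock is nearly frozen)
```

The limit of this quantity is zero, but proper time itself is simply undefined on the null geodesic, which is the distinction made above.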
Your other points -- about the metric expansion and vacuum, in particular -- are matters of interpretation of the mathematical details that are fine enough at this level, although they are far from the only physical interpretations available.
It makes sense if one pictures the aether as a finite substance over an infinite surface area, in and of itself. Like a limited amount of paint that gets smooshed and pushed around. Intuitively I'd postulate that the aether is an incompressible fluid at 'neutral' - a fundamental formless form with directionality. Then when the aether interacts with itself in opposing currents it creates an inertia against itself, causing it to evaporate into matter all the way into planet cores, rock, water, air, then outer space. It basically becomes a pressure gradient at different critical boundaries. Makes sense to me. Could simplify a lot of science, most likely: just import the electron as a monopole that builds up into hydrogen and so on. Aether splits into E- and E+ and clumps until 1836 (proton/2 918 aether e+ e-).
Can be radically simplified with Torus, Aether, and Hyperboloid fluidic models around Vortices.
One interesting question is whether science changes once the other galaxy groups disappear. By that I mean: if a human were to redo all previous experiments, have we lost any data that would result in us building a different model of the universe than what we've built with the other galaxy groups present? Sure, we will have records of evidence from the past, but records of evidence from experiments that can no longer be repeated are no longer reliable data. This is ignoring the whole development of humanity and science in the intervening 150 billion years, which may change what options are available to us then.
I'm also not sure if this is the most up to date model of the universe. I've recently dived deep into quantum field theory and theoretical physics (well as deep as one can without understanding a field of spinors) and I've watched a number of lectures given over the last decade that include alternate takes on expansion. I could be completely misunderstanding what was being spoken about, but it appears one possible model of reality is a sort of bubble in hyperexpanding space that gives rise to a big bang event that eventually leads to the bubble becoming hyper expanding space itself that can give rise to further bubbles.
Here is one talk: https://www.youtube.com/watch?v=jhnKBKZvb_U

I recommend the whole thing if you aren't familiar with Boltzmann brains and boxes, but if you want to see the issue with the current model presented in this article, go to the 41-minute mark; to see a possible alternate model, go to about 44:50.
The galaxies of the local group won't recede from us.
Telescopes as recently as the early 20th century could not resolve the sky well enough to determine which nebulae are in the Milky Way, are satellites of the Milky Way, are in the local group, or are outside it. This lack of observation allowed the Shapley-Curtis Great Debate (https://en.wikipedia.org/wiki/Great_Debate_(astronomy)) to last as long as it did -- it ended with clear observations of light curves within M31 and other galaxies in the local group. Those observations will remain available for a lonnnnnnnnnnng time after there are no other galaxy groups in the view of anything in the local group.
Moreover, the relic fields (cosmic microwave background and cosmic neutrino background) will be detectable (at least in principle) as they grow ever colder in the far future, so at least the radiation and dark energy sectors can be rediscovered in the distant future when there are no highly redshifted galaxies left to see.
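The cooling is simple redshift scaling: the CMB temperature falls as 1/a, where a is the cosmological scale factor (a = 1 today, T ≈ 2.725 K). As a one-liner:

```python
T_CMB_TODAY = 2.725  # kelvin, present-day CMB temperature

def cmb_temperature(a):
    """CMB temperature (K) when the universe has scale factor a (a = 1 today)."""
    return T_CMB_TODAY / a

# By the time linear scales have doubled (a = 2), the CMB has cooled to ~1.36 K;
# far-future detectors would be chasing an ever-colder, ever-sparser signal.
```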
We are lucky to have clear views of redshifted, dimmed, smaller-on-the-sky spiral galaxies at all sorts of orientations to us (including face on). Those certainly help us to understand galactic evolution, and also provide us lots of interesting objects (gas and dust clouds, active nuclei, stars, supernovae) "in the past". In the far future, the local group will have fewer spirals, fewer stars, almost no low-metallicity stars, and so on, and recovering those from observation of what's left in the distant future will be more difficult. But around the gargantuan black holes left in the local group in the very far future there will still be plenty of dust and gas and cooling stellar remnants; the occasional star will probably form, but not often. Still, the volume will be large enough far into the future that "not often" simply invites wider searches of the sky. Supernovae don't happen often, but thanks to all-sky surveys like http://www.astronomy.ohio-state.edu/~assassin/index.shtml we see A LOT more of them than could have been hoped for a hundred years ago. Far future astronomers might have a super-hyper-ultra ASAS-SN type programme to look for the earliest signs of star formation in what's left of the local group.
Of course as you note, even better would be if the far future astronomers can recover the records of earlier astronomers, rather than having to rediscover everything from scratch using the "records" provided by nature at astronomical scales.
The far far future where one cannot see distant galaxies from the local group is very very very very very much earlier than the epoch of fluctuations into Boltzmann brains, vacuum decay, etc. etc., none of which is a really reliable prediction of well-tested theory. (In fact, they are diagnostic of theoretical deficiencies: how do we prevent objects that can happen according to a theory from happening according to that theory, given that we do not see those objects? This recurs a lot at the edges of theoretical physics: in General Relativity it's the basis for developing the energy conditions (https://en.wikipedia.org/wiki/Energy_condition), for instance.)
If anyone begins to feel trapped in a cage, please don't just yet. For instance, astronomers currently estimate on the order of 10¹² planets in the Milky Way alone, of which up to 11 billion could be Earth-sized and orbiting Sun-like stars [1]. So there's still plenty to explore/colonize, especially if you include the few tens of neighboring galaxies.

[1] https://en.wikipedia.org/wiki/Milky_Way#Contents
This might be obvious to most, but at first it wasn't clear to me why in 150 billion years the vast majority of the known universe will be unreachable. The answer is in the acceleration of expansion (which the blogpost mentioned but I didn't connect the dots on at first); eventually the rate of expansion ("velocity") will be greater than the speed of light.
Parsing this, I’d like to match it up with the Hubble horizon, at least for your stated condition that the velocity of expansion is greater than the speed of light: that already holds for us relative to points across the Hubble horizon. Their “past-emitted” light is still reaching us; meanwhile their “presently emitted” light never will.
I need to check the proper definitions, but that’s the gist. Kinda eerie!
Distant regions of the universe already are receding from us faster than light. Due to the accelerating rate of expansion, that horizon beyond which this is true is shrinking inwards towards us, so that in 150 bn years even galaxies outside our local group will be beyond that horizon.
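For rough numbers: with Hubble's law v = H₀d and H₀ ≈ 70 km/s/Mpc (a round value I'm assuming for illustration), the distance at which recession reaches c today is:

```python
H0 = 70.0             # km/s per Mpc, rough present-day Hubble constant
C_KM_S = 299_792.458  # speed of light in km/s

def recession_speed(d_mpc):
    """Hubble's law: recession speed (km/s) at proper distance d_mpc (Mpc)."""
    return H0 * d_mpc

hubble_distance = C_KM_S / H0  # ~4283 Mpc, i.e. roughly 14 billion light-years
# Anything beyond hubble_distance already recedes faster than light today.
```

With accelerating expansion, the comoving version of this horizon closes in, which is the "150 bn years" point above.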
Even the closest galaxies are so far away, it's unlikely we could reach them even with a warp drive.
However, it's possible that the accelerated expansion might prove an aberration that only makes the galaxies seem as if they are "quickly" moving beyond our reach. In which case, what a relief! LOL
I don’t think human science is remotely advanced enough to make one-million-year predictions, let alone 150-billion-year ones. We still have the giant dark matter fudge factor that eludes experimental verification, we’re still inventing new particles to make quantum math work, and we still can’t come up with any theory that explains all our observations.
Extreme predictions require extreme precision of data and model, and it seems that our models have already hit their limits on much more pedestrian problems. Like Newton’s physics, they are a function that fits reality well only within a certain range of values.
This is not an anti-science sentiment, by the way, just maintaining scope and margins of error.
We're talking about the expansion of the Universe here. This expansion has 14 billion years of evidence in the observable Universe to support it.
Experimental evidence has shown this expansion is accelerating, not slowing.
This isn't to say that the expansion won't slow, stop or even reverse, but that claim requires a testable theory. "The Universe is expanding, based on all our understanding and all the experimental evidence we have to date" is not an equivalent position to "well, we really don't know what will happen in a billion years".
One wonders if it is some fundamental assumption that limits further understanding of the Universe. Is it our intuitive relationship with numbers, time, and physical space that is limiting? We all grow up with societally imposed relationships between numbers, time, and physical space, but are these learned, or do they actually reflect objective reality? The frameworks derived from these fundamental assumptions are excellent for making predictions and are testable within those frameworks.
> One wonders if it is some fundamental assumption
One could even go further...
Russell once compared the movement of the planets around the sun with people carrying torches and walking around a mountain during the night. Because it is dark, we don't see that people are carrying those torches, and we wonder why those lights move in such strange ways. Then the sun rises (Einstein presents his relativity theory) and we understand.
We hope such an explanation also exists for the entire universe and its physical laws. But could it be that the explanation is beyond the capabilities of the human brain? I am not talking about complexity or problem size. That could be solved with computers. Maybe there is an elegant, obvious and compact explanation of why the universe behaves like it does, but it is so much beyond what we consider logical that if an alien had written it down in a book, we would not even realize what the book is about. Like trying to explain to a dog the concepts of "yesterday" and "tomorrow". Is there a reason to assume that our ape brains are able to understand everything?
That humans exist a mere 14 billion years into a universe that will exist for many quintillions of years is so exceedingly improbable that we may very well be the first sentient life. This is one of my favorite explanations for the Fermi Paradox.
My other favorite is that there is no Paradox: the signs of alien civilizations are everywhere, but humans cannot perceive or understand them.
I believe that spacefaring life is relatively rare in our neighbourhood to the point where it's entirely possible we're the only one.
As for signs of alien civilization being everywhere, here I have to respectfully disagree. I'll summarize why.
I firmly believe the likely future of humanity is not on planets but in orbitals. Planets are a great way of storing mass. They're a poor way of creating living area. I think the estimate is that 1% of the mass of Mercury could fully encompass the Sun in a Dyson Swarm of orbitals with no material stronger than stainless steel. I believe humanity will be capable of this within 1,000 years and that's super-conservative.
Objects in space that absorb Solar radiation heat up. The only way of releasing that is by radiating it away. That radiation is of a wavelength solely determined by the temperature of the object. For any reasonable temperatures, that's in the infrared spectrum.
So a Dyson Swarm will have a very particular spectrum. They'll be IR lanterns. Sure, we might miss one star out of a whole galaxy, but the barrier between doing this to one star and doing it to the whole galaxy is only a relatively short amount of time (cosmologically speaking): one million years for the Milky Way (out of 10B+ years).
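The "IR lantern" spectrum follows from Wien's displacement law, λ_peak = b/T with b ≈ 2.898×10⁻³ m·K -- any swarm kept at a livable temperature peaks deep in the infrared:

```python
WIEN_B = 2.898e-3  # Wien displacement constant, metre-kelvins

def peak_wavelength_m(temp_k):
    """Wavelength (m) at which a blackbody at temp_k kelvin emits most strongly."""
    return WIEN_B / temp_k

# A ~300 K swarm element peaks near 9.7 micrometres (mid-infrared),
# nothing like the ~0.5 micrometre peak of a ~5800 K Sun-like star.
```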
There are a bunch of assumptions in this and each would be their own topic. But as a solution to the Fermi Paradox, it only takes one.
Recycling waste heat is a common objection but that just reduces your emissions, it doesn't eliminate them. If it did, that would break thermodynamics.
This argument is strengthened by this article: over time more mass is becoming unreachable. For truly long-lived civilizations it makes the most sense to grab as much mass as you can and sequester it. Again, it only takes one.
Also for most of those 14 bn years heavy elements weren't abundant enough for many rocky planets to form. It took several generations of stars and cycles of supernovas to end up with the heavy element rich environment we find ourselves in. Our star's siblings might be the first generation with a rich enough planetary environment and abundance of heavy elements to support life and technological civilizations.
Aren't most of those quintillions spent during the heat death, when there is no real available energy and everything is nothing more than a barely held together soup of matter?
Fear of individual/personal death is somewhat mitigated because we know other beings will survive us. Now imagine if the destiny of all life forms and stars is an eternal and permanent death -- all of them, forever. Feels like staring into an abyss.
It's a reasonable concern. I think we'll probably survive indefinitely, after all if Mars can be colonised it's hard to imagine we could screw up Earth so badly it's less habitable than that.
However, the planetary environment could become so hostile that it takes so much effort to maintain civilization that our capabilities become severely limited. So my concern isn't so much us dying out, as we become doomed to a fairly marginal existence, incapable of interesting achievements such as interstellar exploration.
So a star A at a distance of 13.7B ly from us is running away at "c". Unless you believe that today is a unique moment in time, we can expect that in 100M years a star B at a distance of 13.8B ly will also be moving away from us at "c".
If we allow for accelerated expansion of the Universe, then it means that the star B is closer to us than the star A (with the star A moving away faster than "c" in 100M years from today - i.e. accelerated expansion) - i.e. today the star B is closer to us than 13.7B ly and runs away slower than "c", and thus in the next 100M years the star B would be expected to make a journey longer than 100M ly - from some point closer to us than 13.7B ly today to the point at 13.8B ly in 100M years - i.e. it would need to run away faster than "c" during those 100M years, which is clearly a contradiction.
That leads to the conclusion that whatever speed a star runs away with today, it will be running away with the same speed tomorrow - which means a decreasing Hubble constant, which is normal given the same energy being spread through a bigger and bigger space. And when it comes to observing the observable Universe: the star A will always run away at "c", and whatever stars B are closer to us than the star A (and thus running away slower than "c" today), those stars B will never run away faster than "c" - thus the light from them will always reach us.
Note: the stars which are farther away moving faster isn't acceleration of expansion. Acceleration of expansion would be if the same stars were running away faster tomorrow than today - which would clearly lead to the contradiction pointed out above. A Hubble constant decreasing with time is the only way to avoid that contradiction. That also probably explains the different Hubble constant values we currently have, as they are obtained using signals from different points in the Universe's history.
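What this comment describes is a "coasting" universe, a(t) ∝ t: there H(t) = 1/t falls with time while every comoving object keeps a constant recession speed. A toy check of that internal picture (my own sketch of the scenario described, not a claim about which expansion history is correct):

```python
def scale_factor(t):
    """Coasting universe: a(t) grows linearly with time."""
    return t

def hubble(t):
    """H(t) = a'(t)/a(t) = 1/t -- a Hubble 'constant' that decreases with time."""
    return 1.0 / t

def recession_speed(t, x):
    """Recession speed of a comoving object at comoving coordinate x: H * (a * x)."""
    return hubble(t) * scale_factor(t) * x

# The speed equals x at every epoch: each star keeps its speed while H falls as 1/t.
```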
[+] [-] raattgift|5 years ago|reply
Telescopes as recently as the early 20th century could not resolve the sky well enough to determine which nebulae are in the Milky Way, are satellites of the Milky Way, are in the local group, or are outside it. This lack of observation allowed the Shapeley-Curtis https://en.wikipedia.org/wiki/Great_Debate_(astronomy) to last as long as it did -- it ended with clear observations of light curves within M31 and other galaxies in the local group. Those observations will remain available for a lonnnnnnnnnnng time after there are no other galaxy groups in the view of anything in the local group.
Moreover, the relic fields (cosmic microwave background and cosmic neutrino background) will be detectable (at least in principle) as they grow ever colder in the far future, so at least the radiation and dark energy sectors can be rediscovered in the distant future when there are no highly redshifted galaxies left to see.
We are lucky to have clear views of redshifted, dimmed, smaller-on-the-sky spiral galaxies at all sorts of orientations to us (including face on). Those certainly help us to understand galactic evolution, and also provide us lots of interesting objects (gas and dust clouds, active nuclei, stars, supernovae) "in the past". In the far future, the local group will have fewer spirals, fewer stars, almost no low-metallicity stars, and so on, and recovering those from observation of what's left in the distant future will be more difficult. But around the garganguan black holes left in the local group in the very far future there will still be plenty of dust and gas and cooling stellar remnants; the occasional star will probably form, but not often. Still, the volume will be large enough far into the future that "not often" simply invites wider searches of the sky. Supernovae don't happen often, but thanks to all-sky surveys like http://www.astronomy.ohio-state.edu/~assassin/index.shtml we see A LOT more of them than could have been hoped for a hundred years ago. Far future astronomers might have a super-hyper-ultra ASAS-SN type programme to look for the earliest signs of star formation in what's left of the local group.
Of course as you note, even better would be if the far future astronomers can recover the records of earlier astronomers, rather than having to rediscover everything from scratch using the "records" provided by nature at astronomical scales.
The far far future where one cannot see distant galaxies from the local group comes very, very much earlier than the epoch of fluctuations into Boltzmann brains, vacuum decay, etc. etc., none of which is a really reliable prediction of well-tested theory. (In fact, they are diagnostic of theoretical deficiencies: how do we prevent objects that can happen according to a theory from happening according to that theory, given that we do not see those objects? This recurs a lot at the edges of theoretical physics: in General Relativity it's the basis for developing the energy conditions (https://en.wikipedia.org/wiki/Energy_condition), for instance.)
[+] [-] gattr|5 years ago|reply
[1] https://en.wikipedia.org/wiki/Milky_Way#Contents
[+] [-] amelius|5 years ago|reply
What will that do to the speed of light as measured on Earth?
And will this affect, say, the correct operation of computers?
[+] [-] simonh|5 years ago|reply
Hurry up Elon, we're running out of time!
[+] [-] transfire|5 years ago|reply
However, it's possible that the accelerated expansion might prove an aberration that only makes the galaxies seem as if they are "quickly" moving beyond our reach. In which case, what a relief! LOL
[+] [-] twhb|5 years ago|reply
Extreme predictions require extreme precision of data and model, and it seems that our models have already hit their limits on much more pedestrian problems. Like Newton’s physics, they are a function that fits reality well only within a certain range of values.
This is not an anti-science sentiment, by the way, just maintaining scope and margins of error.
[+] [-] cletus|5 years ago|reply
Experimental evidence has shown this expansion is increasing, not decreasing.
This isn't to say that the expansion won't slow, stop or even reverse, but that claim requires a testable theory. "The Universe is expanding, based on all our understanding and all the experimental evidence we have to date" is not an equivalent position to "well, we really don't know what will happen in a billion years".
[+] [-] tralarpa|5 years ago|reply
One could even go further...
Russell once compared the movement of the planets around the sun with people carrying torches and walking around a mountain during the night. Because it is dark, we don't see that people are carrying those torches, and we wonder why those lights move in such strange ways. Then the sun rises (Einstein presents his relativity theory) and we understand.
We hope such an explanation also exists for the entire universe and its physical laws. But could it be that the explanation is beyond the capabilities of the human brain? I am not talking about complexity or problem size. That could be solved with computers. Maybe there is an elegant, obvious and compact explanation of why the universe behaves like it does, but it is so far beyond what we consider logical that if an alien had written it down in a book, we would not even realize what the book is about. Like trying to explain to a dog the concepts of "yesterday" and "tomorrow". Is there a reason to assume that our ape brains are able to understand everything?
[+] [-] rendall|5 years ago|reply
My other favorite is that there is no Paradox. The signs of alien civilizations are everywhere, but humans cannot perceive or understand them.
[+] [-] cletus|5 years ago|reply
As for signs of alien civilization being everywhere, here I have to respectfully disagree. I'll summarize why.
I firmly believe the likely future of humanity is not on planets but in orbitals. Planets are a great way of storing mass. They're a poor way of creating living area. I think the estimate is that 1% of the mass of Mercury could fully encompass the Sun in a Dyson Swarm of orbitals with no material stronger than stainless steel. I believe humanity will be capable of this within 1,000 years and that's super-conservative.
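That "1% of Mercury" figure can be sanity-checked with a back-of-envelope calculation. The 1 AU swarm radius, the uniform spreading over a full sphere, and the steel density are my assumptions, not part of the original estimate:

```python
import math

# Back-of-envelope check of the "1% of the mass of Mercury" Dyson Swarm figure.
# Assumed: the swarm sits at 1 AU and its mass is spread uniformly over a sphere.
M_MERCURY = 3.30e23      # kg, mass of Mercury
AU = 1.496e11            # m, Earth-Sun distance
STEEL_DENSITY = 7850.0   # kg/m^3, ordinary carbon steel

swarm_mass = 0.01 * M_MERCURY                         # 1% of Mercury
sphere_area = 4.0 * math.pi * AU**2                   # m^2 of sphere at 1 AU
areal_density = swarm_mass / sphere_area              # kg per m^2 of sky coverage
equivalent_thickness = areal_density / STEEL_DENSITY  # m, if it were solid steel

print(f"areal density: {areal_density * 1000:.1f} g/m^2")                      # ~11.7 g/m^2
print(f"equivalent steel thickness: {equivalent_thickness * 1e6:.1f} microns") # ~1.5 microns
```

The mass budget works out to only micrometres of average steel thickness, which is why the proposal is a swarm of thin, sparse collectors rather than a solid shell.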
Objects in space that absorb Solar radiation heat up. The only way of releasing that is by radiating it away. That radiation is of a wavelength solely determined by the temperature of the object. For any reasonable temperatures, that's in the infrared spectrum.
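The wavelength claim is Wien's displacement law: the blackbody emission peak sits at λ = b/T. A quick sketch (the ~300 K habitat temperature is an illustrative assumption):

```python
# Wien's displacement law: a blackbody at temperature T emits most strongly
# near lambda_peak = b / T, with b ≈ 2.898e-3 m·K.
WIEN_B = 2.898e-3  # m*K, Wien displacement constant

def peak_wavelength_um(temperature_kelvin):
    """Peak emission wavelength, in micrometres, of a blackbody at the given temperature."""
    return WIEN_B / temperature_kelvin * 1e6

# A habitat radiating waste heat at roughly room temperature peaks deep in the infrared:
print(peak_wavelength_um(300.0))   # ~9.7 um, well into the IR
```

For comparison, the Sun at ~5800 K peaks near 0.5 μm (visible light), which is why a swarm that absorbs sunlight and re-radiates at habitable temperatures would stand out as an IR-bright, optically-dim object.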
So a Dyson Swarm will have a very particular spectrum. They'll be IR lanterns. Sure, we may not see one star out of a whole galaxy, but the barrier between doing this to one star and doing it to the whole galaxy is only a relatively short amount of time (cosmologically speaking). One million years for the Milky Way (out of 10B+ years).
There are a bunch of assumptions in this and each would be their own topic. But as a solution to the Fermi Paradox, it only takes one.
Recycling waste heat is a common objection but that just reduces your emissions, it doesn't eliminate them. If it did, that would break thermodynamics.
This argument is strengthened by this article: over time more mass is becoming unreachable. For truly long-lived civilizations it makes the most sense to grab as much mass as you can and sequester it. Again, it only takes one.
[+] [-] lmilcin|5 years ago|reply
And most of what is still visible is also completely unreachable.
[+] [-] simonh|5 years ago|reply
However, the planetary environment could become so hostile that it takes so much effort to maintain civilization that our capabilities become severely limited. So my concern isn't so much us dying out, as we become doomed to a fairly marginal existence, incapable of interesting achievements such as interstellar exploration.
[+] [-] trhway|5 years ago|reply
If we allow for accelerated expansion of the Universe, then it means that the star B is closer to us than the star A (with the star A moving away faster than "c" in 100M years from today - i.e. accelerated expansion) - i.e. today the star B is closer to us than 13.7B ly and runs away slower than "c", and thus in the next 100M years the star B would be expected to make a journey longer than 100M ly - from some point today closer to us than 13.7B ly to the point at 13.8B ly in 100M years - i.e. it would need to run away faster than "c" during those 100M years, which is clearly a contradiction.
That leads to the conclusion that whatever speed a star runs away with today, it will be running away with the same speed tomorrow - that means a decreasing Hubble constant - which is normal, given the same energy being spread through a bigger and bigger space. And when it comes to observing the observable Universe - the star A will always run away at "c", and whatever stars B are closer to us than the star A (and thus running away slower than "c" today), those stars B will never run away faster than "c" - thus the light from them will always reach us.
Note: the stars which are farther away moving faster isn't acceleration of expansion. Acceleration of expansion would mean the same stars running away faster tomorrow than today - which would clearly lead to the contradiction pointed out above. A Hubble constant decreasing with time is the only way to avoid that contradiction. That also probably explains the different Hubble constant values we currently have - as they are obtained using signals from different points in the Universe's history.
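The 13.7B-ly figure above is roughly the Hubble radius: the distance at which Hubble's law v = H0 · d gives a recession speed of c. A quick sketch of that arithmetic (H0 = 70 km/s/Mpc is an assumed round value; published measurements differ):

```python
# Hubble's law: recession speed v = H0 * d. Setting v = c gives the
# Hubble radius d = c / H0, the distance where recession reaches lightspeed.
C_KM_S = 2.998e5      # km/s, speed of light
H0 = 70.0             # km/s/Mpc, assumed round present-day Hubble constant
MPC_TO_MLY = 3.2616   # million light-years per megaparsec

hubble_radius_mpc = C_KM_S / H0                           # ~4280 Mpc
hubble_radius_gly = hubble_radius_mpc * MPC_TO_MLY / 1000.0

print(f"Hubble radius: {hubble_radius_gly:.1f} Gly")   # ~14.0 Gly
```

Note that the Hubble radius moves as H0 changes over cosmic time, which is why the comment's argument hinges on whether H0 grows, shrinks, or holds steady.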