zarmin | 4 months ago
to us, the sun appears to be the size of, let's say, a quarter held at arm's length. this is at 93M miles (1AU, or ~8 light minutes) distance. if we moved the sun 100 miles away from earth, it would take up the entire sky. now in the other direction, if we doubled the distance, to 2AU, it would appear to us as half its normal size and 1/4 as bright (irradiance follows inverse square law). at 3AU the sun would be 1/9 as bright and 3x smaller than a quarter. at 100AU, we're talking about brightness of 1/100^2 (one ten-thousandth) the sun's apparent brightness. with me so far?
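here's that scaling as a quick back-of-the-envelope check (rough Python, nothing but the inverse square law described above):

    # apparent size and brightness of the sun at a few distances, relative to 1 AU
    for distance_au in (1, 2, 3, 100):
        apparent_size = 1 / distance_au      # angular size scales with 1/distance
        brightness = 1 / distance_au ** 2    # irradiance follows the inverse square law
        print(f"{distance_au:>4} AU: size x{apparent_size:.4f}, brightness x{brightness:.8f}")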
Sirius A: the brightest star we can see; 25x more luminous than the sun; 2x the size of the sun; 8.6 light YEARS distance (544,000AU) from earth.
if we moved the sun to the same distance as Sirius A, it would appear 296 BILLION times dimmer and 544,000 times smaller. yet Sirius A is easily visible - the brightest star in our sky - despite being only 25x more luminous and 2x larger.
do you see the discrepancy? 25x more luminous doesn't compensate for a 296-billion-fold brightness loss. The numbers we are given don't make sense, not even close. (and this is without considering diffusion, which would make the discrepancy even worse.) i'm not proposing an explanation or a modification to the model, i just think the data don't make sense.
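for anyone who wants to check the arithmetic, the factors come straight from the distance ratio (rough Python, same numbers as above):

    # dimming factor if the sun were moved to Sirius's distance
    SIRIUS_DISTANCE_AU = 544_000            # 8.6 light years expressed in AU
    dimming = SIRIUS_DISTANCE_AU ** 2       # inverse square law
    print(f"dimming factor: {dimming:.3g}")                 # ~2.96e11, i.e. ~296 billion
    print(f"after the 25x luminosity: {dimming / 25:.3g}")  # still ~1.2e10 times dimmer than the sun at 1 AU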
hmorgan | 4 months ago
In the case of your thought experiment, the critical factor is that our eyes are able to observe and adjust to a very wide range of brightness in different conditions. Sirius A really is billions of times dimmer than the sun to our eyes (hard to find a good reference for that, but this mentions it: https://ecampus.matc.edu/mihalj/astronomy/test5/stellar_magn...).
zarmin | 4 months ago
Here, the math doesn't check out. That's my point.
I'm not saying "it seems like stars should be invisible but they're not", I'm showing that the inverse square law - which we can verify at human scales - predicts invisibility at stellar distances, and the proposed compensation (25x more luminosity) is insufficient by orders of magnitude.
Sirius is "billions of times dimmer" than the sun to our eyes IF you mean the Sun as seen from Earth versus Sirius as seen from Earth. But that's not the comparison. The comparison is:
Sun moved to 544,000 AU (Sirius's distance): 296 billion times dimmer than the Sun at 1 AU
Sirius at 544,000 AU: 25x brighter than that
25x doesn't bridge a 296-billion-fold gap, and the eye's dynamic range is irrelevant; we're comparing the brightness that should reach the eye against the compensation the model claims.
If your claim is "the eye can see across many orders of magnitude, so even though the Sun would be invisible at stellar distances, Sirius being slightly brighter makes it visible," then do the actual calculation. Show that 25x more luminosity produces enough photons to cross the detection threshold. Because the math I'm showing says it doesn't.
You're assuming the model works and looking for why my intuition is wrong. I'm showing the model's numbers are internally inconsistent. Those aren't the same thing.
>I've found that when I have a thought that seems to contradict the "established" model of the world, I tend to just be missing some critical factor.
Does it bother you that to make relativity work, they had to invent dark matter and dark energy - 96% of the universe's mass-energy - as fudge factors? At what point does "missing a critical factor" become "the model requires constant patching to match observations"?
GolfPopper | 4 months ago
These are all numbers you just provided, with no source for them.
But even using your numbers, 300 billion is 3x10^11. The Sun provides about 10^5 lux, while starlight overall provides about 10^-4 lux[1], which is a difference of 10^9, meaning the difference between "all the starlight on a dark night" and "just the starlight from Sirius" would be around 10^2, which... seems about right?
1. https://en.wikipedia.org/wiki/Orders_of_magnitude_%28illumin...
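Spelled out (back-of-the-envelope Python, same round numbers):

    # orders-of-magnitude comparison using the figures above
    sun_lux = 1e5              # direct sunlight at Earth, from the Wikipedia table
    all_starlight_lux = 1e-4   # total starlight on a dark night, same table
    dimming = 3e11             # the ~544,000^2 factor from the parent comment

    sun_at_sirius_distance = sun_lux / dimming              # ~3e-7 lux
    ratio = all_starlight_lux / sun_at_sirius_distance      # ~300, i.e. around 10^2
    print(f"{sun_at_sirius_distance:.1e} lux; all starlight / one Sirius-like star ~ {ratio:.0f}x")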
zarmin | 4 months ago
You're comparing the Sun's illuminance at Earth (10^5 lux at 1 AU) to all starlight combined (10^-4 lux), then trying to work backward to what a single star should provide. That's not how this works.
The question isn't "what's the ratio between sunlight and all starlight." The question is: what happens when you move the Sun to stellar distances using inverse square law?
At 1 AU: ~10^5 lux
At 544,000 AU: 10^5 / (544,000)^2 = 10^5 / 3×10^11 ≈ 3×10^-7 lux
That's the Sun at Sirius's distance. Multiply by 25 for Sirius's actual luminosity: ~7.5×10^-6 lux.
Your own Wikipedia source says the faintest stars visible to naked eye are around 10^-5 to 10^-4 lux. So we're borderline at best, and that's with the 25× boost.
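Putting the whole calculation in one place (rough Python, same round numbers as above; the exact product lands slightly above the ~7.5×10^-6 figure because of rounding):

    # predicted illuminance of Sirius: inverse square law plus the claimed 25x luminosity
    sun_lux_at_1au = 1e5
    sirius_distance_au = 544_000
    sirius_luminosity_vs_sun = 25

    sun_at_sirius_distance = sun_lux_at_1au / sirius_distance_au ** 2       # ~3.4e-7 lux
    sirius_predicted = sun_at_sirius_distance * sirius_luminosity_vs_sun    # ~8.5e-6 lux
    naked_eye_threshold = 1e-5                                              # rough lower bound cited above

    print(f"predicted: {sirius_predicted:.1e} lux (threshold ~{naked_eye_threshold:.0e} lux)")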
Moreover, you said "the difference between all starlight and just Sirius would be around 10^2." There are ~5,000-9,000 stars visible to the naked eye. If Sirius provides 1/100th of all visible starlight, and there are thousands of other stars, the math doesn't work. You can't have one star be 1% of the total while thousands of others make up the rest - unless most stars are providing almost nothing, which contradicts the "slightly brighter" compensation model.
Address the core issue: inverse square law predicts invisibility. The 25× luminosity factor is insufficient compensation. Citing aggregate starlight illuminance doesn't resolve this.
ceejayoz | 4 months ago
We can spot a single photon in the right conditions. https://www.nature.com/articles/ncomms12172