Bear with me on this, as clearly there is something fundamentally wrong with my understanding of light particles, distance, and stars, but shouldn't it be the case that sometimes you should not be able to see them?
Since light travels in a straight line (mostly), their distances are massive, and my eye is so very small, the tiniest of angles at which a particle leaves the star would become enormous variations by the time it reached me.
With that in mind, even with the insane number of particles being released, shouldn't they become so wildly diffuse and spread out that they are too faint to detect, or diffuse enough that I see the star, then move 2 feet away and don't?
I guess an analogy would be that a torch works fine on a wall 10 feet away but won't light up a spot 100 feet away, even though all the particles are travelling in a straight line.
If I can see a star from every single position on my side of the planet, how isn't that lighting up the whole sky? Or are a few particles enough to make my retina work and see a very small point of light?
Thanks
In: Physics
The eye is not a camera; it doesn't work like one.
To “light up the sky,” starlight (or moonlight, for that matter) has to interact with the parts of the atmosphere that aren’t between your eye and the light source. When that happens, with clouds on a moonlit night for example, you might see moonlight scattered diffusely off the cloud, but people standing in the “shadow” of that cloud don’t see the moon as a result. [sample](https://news.yale.edu/sites/default/files/styles/featured_media/public/ynews-238597223.jpg?itok=incx5bBX&c=a75e254fe1da31f2732f6b0d7bce1413 )
The torch analogy isn’t completely wrong: light intensity falls off as 1 over the square of the distance. But stars are insanely bright, and many of the ones we can see are intrinsically even brighter than the Sun.
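To put rough numbers on that inverse-square falloff, here is a back-of-envelope sketch. The luminosity, distance, pupil size, and wavelength are illustrative assumptions (a Sun-like star at 10 light-years), not values from the question:

```python
import math

# Illustrative assumptions, not values from the thread:
L = 3.8e26           # luminosity of a Sun-like star, watts
d = 10 * 9.46e15     # distance: 10 light-years in metres
pupil_r = 0.0035     # dark-adapted pupil radius, metres (~7 mm diameter)
wavelength = 550e-9  # green light, metres

# Inverse-square law: the star's output spreads over a sphere of radius d.
flux = L / (4 * math.pi * d**2)              # watts per square metre at Earth

# Power actually entering the pupil.
power_into_eye = flux * math.pi * pupil_r**2  # watts

# Energy of a single photon: E = h * c / wavelength.
photon_energy = 6.626e-34 * 3.0e8 / wavelength  # joules

photons_per_second = power_into_eye / photon_energy
print(f"{photons_per_second:.2e} photons per second")
```

On these assumed numbers the estimate comes out to roughly a few hundred thousand photons per second entering the eye, which is why even a point-like star is comfortably above the retina's detection threshold rather than "a few particles."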