Light is a form of energy.
The further light has to travel, the more energy it loses along the way.
You can compare it to a full-sized pencil and a half-worn-down pencil: the full-sized pencil will be able to draw a longer line than the half-sized one.
Light is not infinite; it is generated from a source. The stronger the source, the brighter the light will be (check the lumen rating on your lightbulb for a sense of scale).
Many rays of light are emitted from a source, at many different angles. The further away you are from the source, the further apart the rays will have spread, meaning fewer will be on a path that intersects your eye/camera. All of the light still exists; it’s just nowhere near you.
Over very, very large distances light does truly lose energy due to cosmological redshift.
If you mean why light dissipates over *distance*, it’s because of the inverse square law.
Imagine you have a light bulb and a square light sensor that is 1 meter × 1 meter. You put the sensor 1 meter away from the bulb and it measures some number.
Next, imagine 4 lines going from the light bulb to each corner of your sensor. It forms a pyramid shape, right?
Let’s leave those lines where they are, but back the sensor up to 2 meters away. Now it’s measuring 1/4th of what we had before… why is that?
Because while we doubled the distance, that pyramid doubled both in width & height. Our sensor stayed a 1 meter square, but all those photons are now spread over a 2 meter x 2 meter square.
If we moved out to 3 meters from the bulb, we would measure 1/9th the amount of light because those photons are spread over a 3×3 meter area.
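Here is a minimal sketch of that scaling in Python, using the hypothetical 1 m × 1 m sensor from the example above: at distance d, the fixed-size sensor catches 1/d² of its 1-meter reading.

```python
# A minimal sketch of the inverse-square scaling described above, assuming the
# hypothetical 1 m x 1 m sensor: the same cone of light covers a d x d meter
# square at distance d meters, so the fixed-size sensor catches 1/d**2 of what
# it caught at 1 meter.

def relative_reading(distance_m: float, reference_m: float = 1.0) -> float:
    """Sensor reading at distance_m, relative to the reading at reference_m."""
    return (reference_m / distance_m) ** 2

for d in (1, 2, 3, 10):
    print(f"{d} m: {relative_reading(d):.4f} of the 1 m reading")
# 1 m: 1.0000, 2 m: 0.2500 (1/4), 3 m: 0.1111 (1/9), 10 m: 0.0100
```

This is also why the "rays getting further apart" picture above works: the same light is spread over a larger and larger area, so a fixed-size detector catches less of it.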
In theory, a laser beam could be made to continue with a constant diameter forever. Think of it like two parallel lines, representing the two sides of the beam.
In practice, it will disperse. Those two lines will not be perfectly parallel. Even if you imagine the lines converging, they will meet at a focus point and diverge from there on.
Given the enormous distances in space, this does not require much of an angle between the two lines.
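To put numbers on that, here is a rough sketch in Python. The 1 mm starting diameter and 1 milliradian divergence angle are illustrative assumptions (roughly a cheap laser pointer), not figures from the answer above.

```python
import math

# A rough sketch of how a tiny divergence angle grows over large distances.
# The 1 mm starting diameter and 1 milliradian full divergence angle are
# illustrative assumptions, not figures from the answer above.

def spot_diameter(initial_diameter_m: float, full_angle_rad: float, distance_m: float) -> float:
    """Approximate beam diameter after travelling distance_m (simple small-angle geometry)."""
    return initial_diameter_m + 2.0 * distance_m * math.tan(full_angle_rad / 2.0)

d0 = 1e-3      # 1 mm starting diameter
theta = 1e-3   # 1 milliradian full divergence angle
for distance in (1.0, 1_000.0, 384_400_000.0):  # 1 m, 1 km, roughly the Earth-Moon distance
    print(f"{distance:>13,.0f} m -> beam about {spot_diameter(d0, theta, distance):,.3f} m across")
```

Even that tiny angle turns a millimeter-wide beam into one hundreds of kilometers wide by the time it reaches the Moon.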
This all assumes a perfect vacuum. Add *any* atmosphere and you will induce scattering.
There are many ways to approach it.
The simplest is probably the [Huygens–Fresnel principle](https://en.wikipedia.org/wiki/Huygens%E2%80%93Fresnel_principle). In this model, any light wave can be thought of as a sum of infinitely many point light sources. The sum of these sources is the same as the wave you want to model.
Now, if you have an infinitely wide plane wave, you can place these point sources across its infinite width and everything is nice. The plane wave will stay the same and propagate to infinity without spreading.
But our wave sources are rarely infinitely wide. You have some sort of aperture.
Image: https://en.wikipedia.org/wiki/File:Refraction_on_an_aperture_-_Huygens-Fresnel_principle.svg
Now the point wave sources exist only across the width of the aperture. If you sum the waves from those point sources, you will see that the result is no longer a neat plane wave. Instead, it spreads out.
How exactly it spreads out depends on the shape of the incoming wave and the shape(s) of the aperture(s). But no matter how you choose the waves, the light will always spread out after passing through an aperture.
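Below is a small numeric sketch of that summation in Python (with NumPy); the wavelength, slit width, and screen distance are illustrative assumptions, not values from the answer. It places point sources across a narrow slit, adds up their wavelets on a distant screen, and shows the bright region ending up far wider than the slit.

```python
import numpy as np

# A small numeric sketch of the Huygens-Fresnel summation: model the aperture
# as a row of point sources, add up a spherical wavelet exp(i*k*r)/r from each
# one at every point on a distant screen, and look at how wide the bright
# region is. The wavelength, slit width, and screen distance are illustrative
# assumptions, not values from the answer.

wavelength = 500e-9               # 500 nm, green light
k = 2 * np.pi / wavelength        # wavenumber
aperture_width = 50e-6            # a 0.05 mm slit
screen_distance = 0.5             # screen half a meter behind the aperture

sources = np.linspace(-aperture_width / 2, aperture_width / 2, 2000)  # point sources across the slit
screen = np.linspace(-0.02, 0.02, 1001)                               # +/- 2 cm on the screen

# Distance from every point source to every screen point, then sum the wavelets.
r = np.sqrt(screen_distance**2 + (screen[:, None] - sources[None, :])**2)
field = np.sum(np.exp(1j * k * r) / r, axis=1)
intensity = np.abs(field) ** 2
intensity /= intensity.max()

# Width of the central bright region (where intensity stays above half its maximum).
bright = screen[intensity > 0.5]
print(f"slit width:              {aperture_width * 1e3:.3f} mm")
print(f"bright region on screen: about {(bright.max() - bright.min()) * 1e3:.1f} mm wide")
```

Make the slit narrower and the bright region gets wider, which is exactly the spreading described above.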