If there is a beam of light from a source in space a million light years away travelling towards the Earth, does it gradually start to become brighter as it gets closer and fade away as it moves further?


If that is true and the source suddenly disappears, does it mean there is an “arrow”-like beam of light travelling in space until it “goes away”?

In: Physics

4 Answers

Anonymous 0 Comments

No, the intensity of light decreases over distance from the point of origin.

Think about what happens if you throw a rock into a pond. The ripple starts off tall near where the rock hits the water, and as it spreads out, its height gets lower and lower.

It’s the same with light: as the beam shines out from the source, the area it covers gets larger and larger, but since the beam carries the same total amount of light, it gets dimmer and dimmer as that area grows.
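A minimal Python sketch of the inverse-square relationship the ripple analogy describes: a point source’s output spreads over a sphere of area 4πr², so doubling the distance quarters the intensity. The solar luminosity figure is just an illustrative stand-in for “the source”.

```python
import math

def apparent_intensity(luminosity_watts: float, distance_m: float) -> float:
    """Intensity (W/m^2) of a point source at a given distance, with its
    output spread evenly over a sphere of area 4*pi*r^2."""
    return luminosity_watts / (4 * math.pi * distance_m**2)

SUN_LUMINOSITY = 3.828e26  # watts (illustrative stand-in for the source)
LIGHT_YEAR = 9.461e15      # metres

# Doubling the distance quarters the intensity:
for ly in (1, 2, 4):
    print(f"{ly} ly: {apparent_intensity(SUN_LUMINOSITY, ly * LIGHT_YEAR):.3e} W/m^2")
```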

Anonymous 0 Comments

From a million light years away, it’s a simple answer: the beam spreads out and gets fainter and fainter in any particular bit of space it passes through. Just like shining a flashlight at something nearby gives a much brighter, tighter spot than shining it at something far away. So if the source suddenly turned off, we’d see the final bit of its light (very faint and spread out) one million years later.

If the source is much, much farther away (well beyond the Milky Way’s gravitational influence), then not only does the beam get dispersed, it also gradually gets redder and redder, and if it started far enough away, the light moves into the infrared and becomes invisible to the eye. And if it’s a source at the edge of a black hole, its intensity drops to essentially undetectable levels and it gets redder and redder as it travels.
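A short sketch of the redshift this answer describes, using the standard relation λ_obs = λ_emitted × (1 + z). The specific wavelengths (550 nm for green light, ~750 nm for the red edge of human vision) are illustrative assumptions, not values from the answer.

```python
def redshifted_wavelength(emitted_nm: float, z: float) -> float:
    """Observed wavelength after a redshift z: lambda_obs = lambda_emitted * (1 + z)."""
    return emitted_nm * (1 + z)

GREEN_NM = 550.0    # roughly the middle of the visible band
IR_EDGE_NM = 750.0  # approximate red edge of human vision

for z in (0.1, 0.4, 1.0):
    obs = redshifted_wavelength(GREEN_NM, z)
    label = "visible" if obs < IR_EDGE_NM else "infrared (invisible to the eye)"
    print(f"z={z}: {obs:.0f} nm -> {label}")
```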

Anonymous 0 Comments

> does it gradually start to become brighter as it gets closer and fade away as it moves further?

This is a flawed premise. You don’t “see the beam” like an object coming closer and passing you. What you see is the light itself, the photons hitting your retina, so how bright the light appears is simply its intensity at the moment it reaches your eyes.

Anonymous 0 Comments

In a perfect vacuum, the intensity of a collimated beam of light (like a laser) does not decrease with distance. However, interstellar space is not a perfect vacuum: any particles or dust along the way attenuate the beam. For a point source, releasing a finite number of photons per unit time in all directions, the intensity falls off with the square of the distance, with further attenuation from interstellar dust on top.
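A minimal sketch contrasting the two cases in this answer: a collimated beam loses nothing to geometric spreading and is dimmed only by extinction (the Beer-Lambert law), while a point source suffers both inverse-square spreading and extinction. The extinction coefficient `kappa` here is a made-up illustrative value; real interstellar extinction is tiny per metre and wavelength-dependent.

```python
import math

def collimated_intensity(i0: float, distance_m: float, kappa: float) -> float:
    """A perfectly collimated beam (ideal laser): no geometric spreading,
    only exponential dust extinction (Beer-Lambert law)."""
    return i0 * math.exp(-kappa * distance_m)

def point_source_intensity(luminosity_w: float, distance_m: float,
                           kappa: float) -> float:
    """A point source: inverse-square spreading times the same extinction."""
    geometric = luminosity_w / (4 * math.pi * distance_m**2)
    return geometric * math.exp(-kappa * distance_m)

# With zero dust (kappa=0) the collimated beam keeps its full intensity
# even after a million light years (~9.461e21 m):
print(collimated_intensity(1.0, 9.461e21, 0.0))  # 1.0
```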

In the scenario where the source is moving relative to the observer, the light gets blueshifted (frequency increases, wavelength decreases) if it is moving towards the observer and redshifted (frequency decreases, wavelength increases) if it is moving away. This is called the Doppler effect. Assuming the interstellar dust is homogeneous, it also follows that a collimated beam passing through it loses intensity exponentially with distance (the Beer-Lambert law), not merely in proportion to it.
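A small sketch of the Doppler effect mentioned above, using the relativistic formula for motion along the line of sight, λ_obs = λ_emitted × √((1 + β)/(1 − β)) with β = v/c. The 550 nm wavelength and 10%-of-c velocity are illustrative assumptions.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def doppler_shifted_wavelength(emitted_nm: float, velocity_ms: float) -> float:
    """Relativistic Doppler shift along the line of sight.
    Positive velocity = receding (redshift); negative = approaching (blueshift)."""
    beta = velocity_ms / C
    return emitted_nm * math.sqrt((1 + beta) / (1 - beta))

# A source receding at 10% of c shifts 550 nm green light toward the red:
print(doppler_shifted_wavelength(550.0, 0.1 * C))   # ~608 nm
# The same source approaching instead shifts it toward the blue:
print(doppler_shifted_wavelength(550.0, -0.1 * C))  # ~497 nm
```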

As for your final question: yes, that’s true. For example, picture a supernova in a distant galaxy. The progenitor star has been long dead, yet we can still see light from the supernova event, sometimes red- or blueshifted as well. Eventually, though, some of that light gets scattered or absorbed by interstellar dust.
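To make the “arrow” of light concrete: a brief flash travels outward as an expanding shell whose thickness is set by how long the source shone, and whose leading edge is as far away as the elapsed time in light years. A minimal sketch, assuming an illustrative one-year supernova flash seen a million years later:

```python
# Light travels one light-year per year, by definition.
LY_PER_YEAR = 1.0

def shell_leading_edge_ly(years_since_flash: float) -> float:
    """Distance of the light's leading edge from the (now dark) source."""
    return years_since_flash * LY_PER_YEAR

def shell_thickness_ly(flash_duration_years: float) -> float:
    """A flash lasting d years occupies a shell d light-years thick,
    expanding outward until absorbed or scattered."""
    return flash_duration_years * LY_PER_YEAR

# A supernova that shone brightly for ~1 year, observed 1,000,000 years later:
print(shell_leading_edge_ly(1_000_000))  # the light is now 1,000,000 ly out...
print(shell_thickness_ly(1.0))           # ...as a shell about 1 ly thick
```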