In a perfect vacuum, the intensity of a collimated beam of light (like a laser) should not diminish. However, interstellar space is not a perfect vacuum: any particles or dust along the path scatter or absorb some of the light, attenuating the beam. For a point source, which emits a finite number of photons per unit time in all directions, the intensity falls off with the square of the distance (the inverse-square law), with further attenuation on top of that due to interstellar dust.
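As a rough numerical sketch (the function name, the optical-depth parameter, and the example numbers are illustrative assumptions, not measured values), the inverse-square falloff plus a simple dust term could look like this in Python:

    import math

    def intensity_point_source(luminosity_watts, distance_m, optical_depth=0.0):
        """Flux from an isotropic point source at a given distance.

        The 1/d^2 factor is pure geometry: the same photons are spread over
        a sphere of area 4*pi*d^2. The exp(-tau) factor models additional
        attenuation by intervening dust; optical_depth is a free parameter here.
        """
        geometric_flux = luminosity_watts / (4 * math.pi * distance_m ** 2)
        return geometric_flux * math.exp(-optical_depth)

    # Example: a Sun-like source (~3.8e26 W) seen from 10 light-years away
    ten_light_years_m = 10 * 9.461e15
    print(intensity_point_source(3.8e26, ten_light_years_m))       # vacuum only
    print(intensity_point_source(3.8e26, ten_light_years_m, 0.5))  # with some dust in the way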
In the scenario where the source is moving relative to the observer, the light gets blueshifted (frequency increases, wavelength decreases) if the source is moving towards the observer and redshifted (frequency decreases, wavelength increases) if it is moving away. This is called the Doppler effect. Assuming the interstellar dust is homogeneous, a collimated beam is attenuated exponentially with distance, since each equal slab of dust removes the same fraction of the remaining light, rather than keeping its intensity constant as it would in a perfect vacuum.
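Here is a minimal sketch of both effects, assuming the standard relativistic Doppler formula for motion along the line of sight and exponential (Beer-Lambert-style) attenuation; the function names, the attenuation coefficient, and the example velocities are illustrative assumptions:

    import math

    C = 299_792_458.0  # speed of light, m/s

    def doppler_shift(rest_wavelength_nm, radial_velocity_ms):
        """Observed wavelength for motion along the line of sight.

        Positive radial_velocity_ms means the source is receding (redshift,
        longer wavelength); negative means it is approaching (blueshift).
        """
        beta = radial_velocity_ms / C
        return rest_wavelength_nm * math.sqrt((1 + beta) / (1 - beta))

    def attenuated_beam_intensity(initial_intensity, alpha_per_m, distance_m):
        """Exponential attenuation of a collimated beam in a homogeneous medium."""
        return initial_intensity * math.exp(-alpha_per_m * distance_m)

    # Example: the H-alpha line (656.3 nm) from a source moving at 3000 km/s
    print(doppler_shift(656.3, 3.0e6))   # receding: shifted to ~663 nm (red)
    print(doppler_shift(656.3, -3.0e6))  # approaching: shifted to ~650 nm (blue)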
As for your final question: yes, it is true. For example, picture a supernova in a distant galaxy. The star that exploded would have been dead for thousands or millions of years by the time its light reached us, yet we would still see the light from the supernova event. That light may be red- or blueshifted as well, and some of it gets scattered by interstellar dust along the way.