I read that laser beams get wider, like a few feet wide by the time they hit the moon. Is that a manufacturing limit, or just something about the physics of laser light? Is a perfect laser beam that doesn’t get wider possible?

13 Answers

Anonymous 0 Comments

It’s a property inherent to laser beams. Most laser beams have a shape called Gaussian, and that type of beam has a property known as divergence, which is basically the angle at which the beam spreads as it travels. The divergence can be controlled, but it has a minimum determined by the wavelength and the beam diameter.

So yes, you can get small divergence, but it requires a large beam size and challenging optics, and it’s always going to have some spread.
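
To put rough numbers on that (a minimal sketch, assuming an ideal Gaussian beam in vacuum and a 532 nm green laser; the waist sizes are just example values):

```python
import math

def gaussian_beam_radius(w0_m, wavelength_m, z_m):
    """Beam radius w(z) of an ideal Gaussian beam at distance z.

    w0_m         -- waist (starting) radius in metres
    wavelength_m -- laser wavelength in metres
    z_m          -- propagation distance in metres
    """
    z_rayleigh = math.pi * w0_m**2 / wavelength_m          # Rayleigh range
    return w0_m * math.sqrt(1 + (z_m / z_rayleigh)**2)

MOON_DISTANCE = 3.84e8    # metres, roughly the Earth-Moon distance
WAVELENGTH = 532e-9       # metres, a green laser

# A handheld pointer (~1 mm beam) versus a half-metre launch aperture
for w0 in (0.5e-3, 0.5):  # waist radius in metres
    theta = WAVELENGTH / (math.pi * w0)   # far-field half-angle divergence
    w_moon = gaussian_beam_radius(w0, WAVELENGTH, MOON_DISTANCE)
    print(f"waist {w0*1e3:6.1f} mm -> divergence {theta*1e6:7.2f} µrad, "
          f"radius at Moon ≈ {w_moon/1e3:10.1f} km")
```

The same wavelength launched from a half-metre aperture spreads roughly a thousand times less than from a pointer-sized one, but it still spreads.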

Anonymous 0 Comments

A laser beam pointed at the moon (from Earth’s surface) will get much wider because the photons hit other particles along the way – air, dust, etc. – causing the light to diffuse. So no, we can’t manufacture a laser that won’t get wider when used practically in our environment.

~~Even in a perfect vacuum with no other particles, the photons themselves will interfere with each other over a long distance causing the beam to widen. But this is quite negligible, so the beam will indeed stay mostly the same size for a long distance.~~

Edit: see u/jaa101’s comment below on what happens in a vacuum; they probably know more than I do.

Anonymous 0 Comments

>Is a perfect laser beam that doesn’t get wider possible?

No. At every interface or slit, light “bends around the corner”; we call this diffraction. A perfectly focused beam of light is limited by the diffraction it experiences, which we call the diffraction limit.

Why light diffracts can only be explained by wave theory, and you can observe it by splashing water waves through a slit; it’s simply how waves “do”, and yes it can be a pain when building optical systems.

This same limit applies to all optical systems like microscopes, fiber optics, telescopes, etc., not just lasers. The fundamental thing is: we need to guide light, so we use optical elements, and slits and interfaces will always cause diffraction.

However, lasers continue to have a uniquely low divergence, which is why we use them nevertheless.

We use several numbers to characterize how “good” a laser beam is, but they mostly derive from the M^(2) number. If it’s 1, you have a beam so good that, if not for diffraction at the aperture, it would not diverge and would remain a “straight beam” forever (i.e. a diffraction-limited beam). If M^(2) is larger than 1, the beam isn’t ideal (and usually it isn’t), but we can still build perfectly good optical systems with larger values of M^(2).
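
A rough sketch of how M^(2) enters the numbers (the wavelength and beam size here are only illustrative):

```python
import math

def divergence_half_angle(wavelength_m, waist_radius_m, m_squared=1.0):
    """Far-field half-angle divergence of a laser beam.

    m_squared = 1 is the diffraction-limited (ideal Gaussian) case;
    real beams have m_squared > 1 and spread proportionally faster.
    """
    return m_squared * wavelength_m / (math.pi * waist_radius_m)

# The same 1 mm diameter, 532 nm beam with different beam-quality factors
for m2 in (1.0, 1.3, 3.0):
    theta = divergence_half_angle(532e-9, 0.5e-3, m2)
    print(f"M^2 = {m2:3.1f} -> divergence ≈ {theta*1e6:6.1f} µrad")
```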

Anonymous 0 Comments

Good question.

Obviously, focused laser beams are going to spread out again past their focal point. That is a property of how the optics are designed.

But in the specific case of an un-focused, collimated beam, like one you might shine at the moon, it really is about the physics of the laser light. For a given aperture size and emission wavelength, you cannot do better than a certain amount of divergence, no matter what you do or how perfect you make your laser.

However, that is for a given aperture and wavelength. So, if you are sneaky-minded, you might say to yourself, “So what if you had a really big aperture and a really short wavelength?” And in that sense, you could say that a “perfect” laser beam with no divergence would be possible if you had an infinitely large aperture and an infinitely short wavelength.

In other words, not actually physically possible, but a cute mathematical thought experiment.
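
If you want to see that thought experiment numerically, here is a sketch using the usual θ ≈ 1.22 λ/D estimate for a circular aperture (the exact prefactor depends on the beam profile, but the scaling is what matters):

```python
def diffraction_limited_half_angle(wavelength_m, aperture_diameter_m):
    """Approximate diffraction-limited divergence of a circular aperture.

    The 1.22 factor comes from the first zero of the Airy pattern;
    the key point is that theta scales as wavelength / aperture.
    """
    return 1.22 * wavelength_m / aperture_diameter_m

wavelength = 532e-9   # metres, a green laser
for aperture in (1e-3, 1e-2, 1.0, 100.0):   # aperture diameter in metres
    theta = diffraction_limited_half_angle(wavelength, aperture)
    print(f"aperture {aperture:7.3f} m -> divergence ≈ {theta:.2e} rad")

# The divergence only reaches zero in the limit of an infinite aperture
# (or zero wavelength), matching the thought experiment above.
```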

Anonymous 0 Comments

Any beam of light will end up spreading out as a spherical wave. Any beam of electromagnetic radiation, as a matter of fact.

Close to the emitter, a laser beam can be modeled as a perfectly cylindrical beam, without any divergence. That’s what’s called the [near field condition](https://en.wikipedia.org/wiki/Near_and_far_field). However, farther away it will start behaving like any other source of light. That’s the far field.
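
A rough way to put a number on where the near field ends is the Rayleigh range z_R = πw₀²/λ (a sketch assuming a Gaussian beam at 532 nm; the waist sizes are just examples):

```python
import math

def rayleigh_range(waist_radius_m, wavelength_m):
    """Distance over which a Gaussian beam stays roughly collimated.

    Beyond z_R = pi * w0^2 / lambda the beam is in the far field and
    spreads at an essentially constant angle.
    """
    return math.pi * waist_radius_m**2 / wavelength_m

wavelength = 532e-9
for w0 in (0.5e-3, 5e-3, 0.5):   # waist radius in metres
    print(f"waist {w0*1e3:6.1f} mm -> near field extends ≈ "
          f"{rayleigh_range(w0, wavelength):12.1f} m")
```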

Anonymous 0 Comments

NASA frequently fires a laser at the moon. The Apollo astronauts left a reflector at one of the landing sites, and NASA uses it to measure the distance between Earth and the Moon. The beam is 2 kilometers across by the time it hits the moon, and *25 kilometers* wide by the time it gets back.
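
Taking those figures at face value, you can back out the divergence they imply (a quick sketch, assuming the average Earth-Moon distance of about 384,400 km; the result is only as good as the numbers quoted above):

```python
import math

MOON_DISTANCE_M = 3.844e8
SPOT_DIAMETER_M = 2_000           # "2 kilometers across" at the Moon

full_angle_rad = SPOT_DIAMETER_M / MOON_DISTANCE_M
arcseconds = math.degrees(full_angle_rad) * 3600
print(f"implied full-angle divergence ≈ {full_angle_rad * 1e6:.1f} µrad "
      f"(about {arcseconds:.1f} arcseconds)")
```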

Anonymous 0 Comments

It is caused by the wave nature of light. The smaller the diameter of the beam, the larger the spread.

Anonymous 0 Comments

Hi.
To answer very briefly: It is about the physics of light (or waves in general).
Even if you have a perfect laser, it will always diverge. As do waves on water when you drop something into it. But you can decrease this divergence by making your laser beam larger or by making the color more “blue” than “red”.

Details:
It doesn’t really matter what your starting point is. You can start with a small laser pointer, for instance. Then you take two lenses: one that makes the beam more divergent, and another that focuses it. If you place them at the correct distance from each other, you will get a larger laser beam that is much less divergent than the laser pointer output. We call these two lenses a “telescope”, and a laser beam that diverges very little is called a “collimated” beam.
Also, longer wavelengths diverge more, but to explain why I would really have to do some math.
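
A minimal sketch of what such a telescope does to the beam (the focal lengths and input beam below are made-up example values, not from any particular device):

```python
def expand_beam(diameter_m, divergence_rad, f1_m, f2_m):
    """Effect of a two-lens beam expander ("telescope") on a laser beam.

    The beam comes out wider by the magnification M = f2/f1 and its
    divergence drops by the same factor.
    """
    magnification = f2_m / f1_m
    return diameter_m * magnification, divergence_rad / magnification

# A ~1 mm pointer beam with ~0.3 mrad divergence, expanded 10x
d_out, theta_out = expand_beam(1e-3, 0.3e-3, f1_m=0.02, f2_m=0.2)
print(f"output beam: {d_out * 1e3:.0f} mm wide, "
      f"{theta_out * 1e6:.0f} µrad divergence")
```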

Source: PhD in laser physics

Anonymous 0 Comments

It’s not just about getting wider: can’t you get a really good lens, focus it at some ideal distance, and get it down to a zero-width point?

Nope. [Diffraction limits](https://courses.physics.illinois.edu/phys214/fa2013/lectures/lecture5.pdf) show the limit of all optical systems. Even if the lenses are “perfect”.

It’s the same rule behind the Rayleigh criterion, which limits the angular resolution of a telescope.

Counterintuitively, a larger initial beam makes the minimum spot size smaller. Starting with a beam twice as wide, and using a lens twice as wide, will reduce the minimum spot size by 50% at a given distance. But then the beam starts out wide and gets smaller and smaller down to its minimum at the focal point, then gets wider again. So if you didn’t focus at exactly the right distance, the beam is even less Star-Wars-ish and traces out a more prominent bowtie/hourglass shape.
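
A sketch of that scaling, using the standard Gaussian-beam focused-spot estimate w ≈ λf/(πw_in) (the wavelength and focal length are just example values):

```python
import math

def focused_spot_radius(wavelength_m, focal_length_m, input_radius_m):
    """Diffraction-limited focused spot radius for a Gaussian beam.

    w_spot ≈ lambda * f / (pi * w_in): a wider input beam (with a lens
    big enough to accept it) focuses to a smaller spot.
    """
    return wavelength_m * focal_length_m / (math.pi * input_radius_m)

wavelength = 532e-9
focal_length = 1.0           # focusing 1 m away, for illustration
for w_in in (1e-3, 2e-3):    # input beam radius in metres
    w_spot = focused_spot_radius(wavelength, focal_length, w_in)
    print(f"input radius {w_in*1e3:.0f} mm -> "
          f"spot radius ≈ {w_spot*1e6:.1f} µm")

# Doubling the input beam halves the minimum spot size, but it never
# reaches zero: that is the diffraction limit again.
```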