How is detail lost as a function of distance?


What I mean is this. Suppose I am 20 km from the Moon. I can see soil details, craters, etc. Then I move to 100 km away from the Moon. I look at the Moon again, and now I am unable to see the same level of detail as before. Why?

Then comes the real question. We look at distant stars and planets using telescopes. I know that current telescopes are not about enlarging pictures but about gathering photons. But suppose we had at our disposal a telescope with infinite enlargement capability. Would we be able to choose a planet, distant as hell, and enlarge it to the point where we could see its surface details, or is detail lost forever as the light travels?

If it is lost, how is it lost?

# fantastic answers! THANKS!

In: Physics

4 Answers

Anonymous 0 Comments

Magnification aside, telescopes can see in greater detail because they have an ability that humans do not: they can take photographs with long exposures. The longer they stare at one object, the more light from that object they will be able to accumulate, eventually combining it into a single image formed from data gathered over a large amount of time. People can only ‘see’ things in the moment – they can’t save an image in their head until more light comes in and makes it sharper.

Light spreads out as it travels, and this is true even in a vacuum: the farther from the source it gets, the weaker it becomes. Intensity actually drops off with the square of the distance (the inverse-square law) – just think how indescribably bright the Sun is up close, and yet from 93 million miles away it is dim enough that a brief glance causes only a little discomfort (staring at it is still harmful, though).
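As a rough illustration, the inverse-square law can be sketched in a few lines of Python (the 20 km / 100 km distances are just the numbers from the question, not anything physical):

```python
# Sketch of the inverse-square law: the light received per unit area
# falls off with the square of the distance from the source.
# The distances here are arbitrary, chosen only to show the scaling.

def relative_intensity(d_near, d_far):
    """Intensity at d_far relative to d_near, for the same source."""
    return (d_near / d_far) ** 2

# Moving from 20 km to 100 km away (5x farther) leaves you with
# 1/25 of the light per unit area.
print(f"{relative_intensity(20, 100):.2f}")  # prints 0.04
```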

Directly imaging a distant planet this way is next to impossible, for a few reasons.

1. Its parent star would drown it out. Like, entirely.
2. Planets are moving. Constantly. They move in front of their star [too bright] and then behind it [blocked], meaning they wouldn’t stay still for the time needed to gather enough light to make anything out, even if we were otherwise able to.
3. If *that’s* not bad enough, they also rotate, just like the Earth. So the window to capture an image before it blurs is down to minutes.
4. The very good point already made above: the light coming from the planet/star gets drowned out by the rest of the light coming from that direction, as its intensity falls too low.

Anonymous 0 Comments

Your eye has a certain resolution. You can only see so much detail because, at greater distances, the angle the object subtends is too small for your eye to resolve, and all you get is an approximation instead of the details.

A telescope is no different. When you “enlarge” the image by collecting more photons, you’re collecting more photons from that object, sure, but you’re also collecting a lot more “garbage” at the same time.

Another way of looking at it is that, as distance increases, the photons reflected by the object spread out. An object only reflects photons so fast (in terms of photons per unit of time, not the speed of the photons), so eventually the rate of photons reaching you from the object becomes a minuscule proportion of the total photons coming from that area. Your object gets drowned in the noise of other, random photons entering your aperture/lens.
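A toy calculation makes the drowning-out concrete. All the rates below are invented: the object's photons fall off as 1/d², while the "garbage" photons from that patch of sky are assumed to arrive at a roughly constant rate.

```python
# Toy model: signal photons from the object fall off as 1/d^2,
# while stray background photons from the same direction arrive
# at a roughly constant rate. All numbers are made up.

def object_fraction(rate_at_unit_distance, distance, background_rate):
    """Fraction of arriving photons that actually came from the object."""
    signal = rate_at_unit_distance / distance ** 2
    return signal / (signal + background_rate)

# The object's share of your aperture's photons collapses with distance:
for d in (1, 10, 100):
    print(d, round(object_fraction(1e6, d, 100.0), 4))
# at d=100 only half the photons are the object's; farther out, almost none
```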

Anonymous 0 Comments

Interesting question. I’m not an expert here but light fascinates me. I think you need to look at this from a few different angles:

1. The farther apart two objects are, the more stuff there is between them to scatter light, which keeps photons (detail) from reaching their destination. This can be dust and gas in space, an atmosphere, and even the lenses in the camera and your eye. Different elements also scatter light differently, which is why Mars has a more reddish atmosphere and Earth a more blueish one. This scattering also creates the twinkle of the stars at night, and is a huge problem for telescopes on Earth.

2. The farther away you are, the fewer photons of light reach you. Imagine a candle on a mountaintop. This candle, at each moment, is radiating a finite amount of light in all directions. Let’s say it’s a billion photons per picosecond. If you are at the top of the mountain, you could enclose the candle with your hands and absorb all the photons. The farther away you are, say at the bottom of the mountain, the smaller the fraction of the total emitted photons you are able to “catch”. Think of this as information loss – light is information. Now, this part gives me some chills to think about. Since planets themselves are not light sources, what you are seeing is how they reflect the light from their nearby star. So an observer on another mountaintop would have to detect the light from the candle, bouncing off your hand at the bottom of the mountain, and back up to their eyeball. The number of photons that make this round trip is absolutely insignificant compared to the original light source.

3. The quality of the optics and the resolution of the sensor. Light bends whenever it travels through something. The more precise the optics, the more light passes through without distortion. The higher the quality of the optics, the better the light can be focused. The more perfectly the light is focused, the more detail can be discerned. Play with a camera and you can experience this first hand.
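The candle example in point 2 maps directly onto the surface area of a sphere: a detector of area A at distance d intercepts a fraction A / (4πd²) of the photons. A small sketch, with all of the specific numbers invented for illustration:

```python
import math

# Candle sketch: n_emitted photons per tick radiate uniformly in all
# directions; a detector of area A at distance d catches the fraction
# A / (4 * pi * d^2) of them. All numbers below are invented.

def photons_caught(n_emitted, detector_area_m2, distance_m):
    fraction = detector_area_m2 / (4 * math.pi * distance_m ** 2)
    return n_emitted * fraction

hands = 0.01  # roughly cupped hands, in m^2 (assumed)
near = photons_caught(1e9, hands, 0.1)     # right next to the candle
far = photons_caught(1e9, hands, 1000.0)   # 1 km down the mountain

# 10,000x the distance means 10,000^2 = 100,000,000x fewer photons:
print(round(near / far))  # prints 100000000
```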

A combination of these and other factors, such as redshift, all contribute to the quality of a photo. The longer you can expose the sensor to a deep-space object, the more photons it can capture. Stacking a thousand exposures of a distant planet and using computer algorithms to blend them together might yield a good image, but there are still limits!

Anonymous 0 Comments

The limit of your eye’s resolution is usually given as an angle.

Picture your eye at the center of an angle, like the point of the symbol <. Your eye is at the point, and imagine there is an object at the end of each arm of the “<” symbol.

The resolution of your eye is the closest those two objects can be together for you to still see them as two distinct objects (<) and not one (slightly larger) object.

The thing with angles is, the farther you are from the point of the angle (your eye), the farther apart the two arms – and thus the two objects – are. So without changing the angle, you may be able to distinguish two objects 1 inch apart if they’re 10 yards away from you, but if they’re 100 yards away the arms of the angle have spread out, so you’ll only be able to distinguish objects that are more than, say, 10 inches apart.

So this means that if you’re looking at a bunch of bananas 1 foot away, you can clearly distinguish each banana, because they’re farther apart than that minimum distance. But if you’re looking at the same bunch of bananas 1000 feet away, they’re now too close together for you to distinguish each banana. If your friend were holding a banana in each hand with his arms stretched out, you could distinguish them, but if they’re right next to each other you may not be able to tell one banana from the other (it just looks like one yellow lump).

So that’s how detail is lost: up close, you may be able to see each individual needle on the branch of a pine tree, because up close they don’t need to be far apart to be able to tell them apart. But if you’re looking at a pine tree in the distance, they are now too close together to distinguish each needle. It all looks like one green object.