eli5: What happens when a live digital video feed is being transmitted to earth from a spacecraft traveling away?

For example, if I were on a rocket to Mars and transmitting a video stream back to Earth non-stop for the duration of the journey…

At the beginning when I am leaving Earth, the lag between me transmitting the video and Earth receiving it would be basically nothing. But by the time I get to Mars there is a delay of 3 minutes or whatever between send and receive.

What happens to the video during the journey? Do some frames randomly drop? Does the video have to buffer constantly? Is there an incremental decrease in quality over time?

18 Answers

Anonymous 0 Comments

Let’s say you start broadcasting on the launch pad and never stop until you get to Mars: the receiver would receive the signal seamlessly. One caveat: this excludes any orbital blackouts where the signal gets blocked, and it ignores the signal strength falling off with distance.

It takes the rocket far longer to get to Mars than it takes the signal to get back to Earth. A conversation is different, because you are waiting for a response, which means the message has to get there and the reply has to travel all the way back.

Yes, there would be signal degradation over that distance; to what extent, I don’t know off the top of my head. It depends on how much redundancy you build into the signal to reconstruct the full picture.

Anonymous 0 Comments

Assuming you’re moving at a relatively constant speed relative to the earth, the information would essentially experience a Doppler effect.
The only effect this would have on the video is on the “capture bitrate” on Earth versus your “streaming bitrate”.
(As an example), if you were moving away at 1/1000th the speed of light, sending 1000 packets of information every second would let Earth capture about 999 of them per second. None would be lost, however; more and more are simply “in transit” at any one time.
(Note: not actually sure of the proper math / formula).
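The arithmetic in that example can be sketched directly. This is the classical (non-relativistic) Doppler stretch; the 1000 packets/s send rate and the 1/1000th-of-light-speed recession are just the illustrative numbers from above.

```python
# Classical Doppler stretch of a constant-rate packet stream from a
# transmitter receding at beta = v/c. No packets are lost; the receive
# rate is simply the send rate divided by (1 + beta) -- the "missing"
# packets are just in flight.

def received_rate(send_rate_hz: float, beta: float) -> float:
    """Packets per second arriving at the receiver."""
    return send_rate_hz / (1.0 + beta)

# 1000 packets/s, receding at 1/1000th of light speed:
print(round(received_rate(1000.0, 0.001), 1))  # -> 999.0
```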

Anonymous 0 Comments

In practical terms we will never have to deal with this, because the speeds at which we travel will never come close to exhausting a buffer. Mars is about 3 light-minutes away at its closest (and over 20 at its farthest), but it takes several months to get there, so the delay grows so gradually that you won’t have any technical problems even if the stream is nonstop the whole way (it would start “live” and end up delayed by many minutes). We might experience this in the coming years, as Artemis missions may well include continuous live-streams, but that will be three days of travel to reach roughly 1.3 seconds of delay.

If you had a hypothetical ship that could accelerate to a significant fraction of light-speed right out of the gate, then you would have a buffering problem: you’d be going so fast that the extra light-travel time you add between successive frames becomes a noticeable fraction of the frame interval itself, and the stream would visibly slow down. I don’t think I can do the exact math on that.
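The gradual delay growth described above can be put into rough numbers; the seven-month trip and the ~20-light-minute final distance here are illustrative assumptions, not mission data.

```python
# Sketch: how the one-way signal delay grows during a cruise to Mars,
# assuming (for illustration) a constant radial speed, a ~7-month trip,
# and a final distance of ~20 light-minutes. Real trajectories differ.

C_KM_S = 299_792.458                      # speed of light, km/s

final_distance_km = 20 * 60 * C_KM_S      # ~20 light-minutes, in km
trip_s = 7 * 30 * 24 * 3600               # ~7 months, in seconds
speed_km_s = final_distance_km / trip_s   # implied average speed

# Extra one-way delay accumulated per hour of cruising:
delay_per_hour_s = speed_km_s * 3600 / C_KM_S
print(f"avg speed ~ {speed_km_s:.1f} km/s, "
      f"delay grows ~ {delay_per_hour_s:.2f} s per hour")
```

At roughly 20 km/s the delay grows by only a fraction of a second per hour, which is why the stream never needs to stall.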

Anonymous 0 Comments

Moving away results in an apparent reduction in frame rate: the faster you go, the greater the reduction, in proportion to the ship’s speed as a fraction of light speed. At typical spacecraft speeds the reduction would be a tiny fraction of a frame per second, so you wouldn’t notice it if the video were continuously streamed. That tiny per-frame stretch accumulates, though, so the cumulative delay can become significant: what you are watching happened minutes to hours before (depending on how far you are from Earth), even though each individual change was too small to notice. It’s like watching the hour hand of an analog clock: it doesn’t move while you watch it, but look again an hour later and it has moved 1/12 of a revolution.
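That per-frame stretch can be sketched quickly; the 30 fps stream and the ~20 km/s recession speed are illustrative assumptions.

```python
# Apparent frame rate from a receding transmitter: each frame interval
# is stretched by (1 + beta), so the received rate drops to
# fps / (1 + beta). The per-second loss is tiny, but it accumulates
# into the slowly growing end-to-end delay.

C_KM_S = 299_792.458                      # speed of light, km/s

def apparent_fps(fps: float, speed_km_s: float) -> float:
    """Frame rate seen by the receiver, classical Doppler only."""
    beta = speed_km_s / C_KM_S
    return fps / (1.0 + beta)

seen = apparent_fps(30.0, 20.0)           # 30 fps stream, ~20 km/s away
print(f"received ~ {seen:.4f} fps (down {30.0 - seen:.4f} fps)")
```

The drop is on the order of thousandths of a frame per second — imperceptible frame to frame, exactly like the hour hand.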
