# eli5: What happens when a live digital video feed is being transmitted to earth from a spacecraft traveling away?

For example, if I was on a rocket to Mars and transmitting a video stream back to Earth non-stop for the duration of the journey…

At the beginning, when I am leaving Earth, the lag between me transmitting the video and Earth receiving it would be basically nothing. But by the time I get to Mars there is a delay of 3 minutes or whatever between send and receive.

What happens to the video during the journey? Do some frames randomly drop? Does the video have to buffer constantly? Is there an incremental decrease in quality over time?

Assuming you’re moving at a relatively constant speed relative to the Earth, the information would essentially experience a Doppler effect.
The only effect this would have on the video is that the “capture bitrate” on Earth differs slightly from your “streaming bitrate”.
As an example, if you were moving away at 1/1000th the speed of light and sending 1,000 packets of information every second, Earth would capture about 999 of them per second. None would be lost, however; more and more of them are simply “in transmission” at any given time.
(Note: not actually sure of the proper math / formula.)
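
For what it’s worth, the numbers above do check out: for a receding source, the classical Doppler formula says the received rate is the sent rate divided by 1 + v/c. A minimal sketch, assuming a constant straight-line recession speed:

```python
# Classical Doppler shift of a packet stream from a receding sender.
# Assumption: constant recession speed, slow enough that relativistic
# time dilation is negligible (certainly true at v = c/1000).
C = 299_792_458.0  # speed of light, m/s

def received_rate(sent_rate_hz: float, v_mps: float) -> float:
    """Packets per second seen by the receiver for a sender receding at v."""
    return sent_rate_hz / (1 + v_mps / C)

rate = received_rate(1000, C / 1000)  # 1000 packets/s, receding at c/1000
print(round(rate, 2))  # ~999.0 -- the "missing" packet is still in flight
```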

>Is there an incremental decrease in quality over time?

That’s probably the closest of your descriptors to what actually happens.

Think of it in a relativistic sense. The rocket is transmitting constantly, so the data will be received constantly (ideally, ignoring the many other possible reasons why a signal might get lost).

But assuming we’re talking about data sent via radio waves, the waves will be slightly [redshifted](https://en.wikipedia.org/wiki/Redshift) due to the spacecraft’s motion away from Earth. There’s also redshifting and blueshifting as a signal climbs up or falls down a gravity well; I don’t know which effect would dominate in this instance, but that’s how the changing distance would alter the signal over time.

These redshifts are small. Very small.

Now, if this were, like, a two-way video call, it wouldn’t *per se* be a matter of *buffering*; it’d be a matter of the *actual time delay* slowly getting bigger and bigger, day by day, as the rocket went on. By the time the rocket gets to Mars, the things Earth observers would be seeing on their screens would *actually* be events from ~20 minutes ago (the one-way delay between Earth and Mars ranges from about 3 to 22 minutes, depending on where the planets are in their orbits). Any reactions they had would be to events from 20 minutes ago, and Earth’s reactions, seen from Mars, would all be reactions to events that took place 40 minutes earlier.

So let’s say that you and a person on Mars agreed to meet up at the same time. You’d both walk up to the screen at the same time, but neither of you would see the other’s face for 20 minutes, because the “feed” would be running 20 minutes behind realtime, and you wouldn’t have any way to speed it up, because the data literally hasn’t gotten there yet. After sitting there for 20 minutes, the two of you would see each other walk up, 20 minutes “late”… except that you both know that’s just how time works. Maybe you’d both say “Hi!”… but all you’d actually see of each other would be the 20 minutes you just spent sitting there waiting for the other person to walk up. 20 *more* minutes would pass, and then, finally, 40 minutes later, you’d see each other’s responses.

At this amount of time delay, you just can’t realistically have a real-time in-person conversation. It’d be way more efficient to just send short video messages and communicate that way.
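
The numbers in the scenario above are easy to check. A quick sketch, taking the 20-minute figure as an assumed fixed distance (it actually varies with the planets’ orbits):

```python
# One-way and round-trip delays for the "meet at the screen" scenario.
# Assumption: a fixed Earth-Mars distance of ~360 million km (near the
# far end of the range); the real distance varies between ~0.4 and ~2.7 AU.
C_KM_S = 299_792.458   # speed of light, km/s
DISTANCE_KM = 360e6

one_way_min = DISTANCE_KM / C_KM_S / 60
print(f"one-way delay: {one_way_min:.1f} min")      # when you first see them
print(f"round trip:    {2 * one_way_min:.1f} min")  # when you see their reply
```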

The people on the ground would be watching a slightly slowed down version of the feed.

To illustrate why, suppose a spacecraft transmits video in pulses, i.e. it sends a frame’s worth of data every 24th of a second or whatever. It’s a simplified model, but it’s sufficient to demonstrate the idea.

Each pulse has to travel slightly further than the one before, which means the time gap between consecutive frames sent is that 24th of a second gap *plus* the time it takes the pulse to travel the distance the spacecraft moved in that 24th of a second gap.

All of this means that while the spacecraft sends out frames every 24th of a second, ground control *receives* those frames slightly more spaced out, which means the footage is received and played back at a slightly slower framerate. The discrepancy is tiny, because light is fast and spacecraft are slow, but it’s there.
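
It’s easy to put a number on “tiny”. A sketch under the same assumptions, using a hypothetical cruise speed of 10 km/s:

```python
# How much the frame spacing stretches when each frame's signal must
# travel farther than the last. Assumes a constant recession speed.
C = 299_792_458.0        # speed of light, m/s
FRAME_INTERVAL = 1 / 24  # seconds between frames at the sender

def received_interval(v_mps: float) -> float:
    # Extra delay per frame = distance the craft moved during one frame / c.
    extra = (v_mps * FRAME_INTERVAL) / C
    return FRAME_INTERVAL + extra

v = 10_000  # 10 km/s, a hypothetical interplanetary cruise speed
stretch = received_interval(v) / FRAME_INTERVAL
print(f"frames arrive {stretch:.8f}x more spaced out")  # ~1.00003336x
```

So at 10 km/s the playback runs about 0.003% slow — utterly invisible to a viewer, but nonzero.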

Let’s say you start broadcasting on the launch pad and never stop until you get to Mars: the receiver would receive the signal seamlessly. One caveat: this excludes any orbital blackouts where the signal gets blocked, and ignores the signal strength falling off with distance.

It takes the rocket far longer to get to Mars than it takes the signal to get back to Earth. A conversation is different, because you are waiting for a response, which means you have to wait for your message to get there *and* for the reply to come back.

Yes, there would be signal degradation over that distance; to what extent, I don’t know off the top of my head. It depends on how much redundancy the signal carries for reconstructing the full picture.
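
On the degradation point: the dominant effect is simple geometry, since received power falls with the square of the distance. A rough sketch using the standard free-space path-loss formula (the 8.4 GHz X-band frequency and the distances are illustrative assumptions, not a real link budget):

```python
# Free-space path loss: how much weaker a radio signal gets with distance.
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Illustrative: an 8.4 GHz link from low Earth orbit vs. Mars at its closest.
print(round(fspl_db(400e3, 8.4e9)))   # ~163 dB from 400 km up
print(round(fspl_db(5.5e10, 8.4e9)))  # ~266 dB from ~55 million km
```

That extra ~100 dB (a factor of ten billion in power) is why deep-space links lean so heavily on error-correcting redundancy and drop to much lower bitrates.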

In practical terms we will never have to deal with this, because the speeds at which we travel will never come close to exhausting a buffer. Mars is between about 3 and 22 light-minutes away, depending on where the planets are in their orbits, but it takes four or five months to get there, so the delay builds so gradually that you won’t have any technical problems even if the stream is nonstop the whole way (it would start “live” and end up delayed by up to ~20 minutes). We might experience this in the coming years, as Artemis missions may well include continuous live-streams, but that will be three days of travel to build up about 1.3 seconds of delay.

If you had a hypothetical ship that could accelerate to significant fractions of light-speed right out of the gate, then you would have a buffering problem: the extra light-travel time added between consecutive frames would become a noticeable fraction of the frame interval itself, so the feed would visibly slow down and lag further and further behind. I don’t think I can do the math to figure that out.
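
For anyone curious, the math left open above is the relativistic Doppler factor: at a recession speed of β = v/c, the received stream plays back slower by a factor of √((1+β)/(1−β)). A sketch:

```python
# Relativistic Doppler slowdown of a video stream from a receding ship.
import math

def slowdown(beta: float) -> float:
    """Factor by which the received stream runs slower than it was sent,
    for a sender receding at beta = v/c (0 <= beta < 1)."""
    return math.sqrt((1 + beta) / (1 - beta))

for beta in (0.001, 0.1, 0.5, 0.9):
    print(f"v = {beta}c -> stream plays {slowdown(beta):.3f}x slower")
```

Even at a tenth of light-speed the feed only runs about 11% slow; the deeper problem is that the backlog of in-flight frames keeps growing for as long as the ship keeps receding.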