I know that it takes <0.1 seconds for light to go across the ocean and back, but how quickly are the messages sent? How can long documents and videos be compressed into flashes that last ridiculously short? How many flashes per millisecond happen?

In: Engineering

Light moves through fiber optic cable around 30% slower than the speed of light in a vacuum, because glass has a refractive index of roughly 1.5.
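You can sanity-check the "under 0.1 seconds" figure in the question with a quick calculation. The cable length here is an assumption (a transatlantic run on the order of 6,000 km); the point is just that light at roughly two-thirds of c covers ocean-scale distances in tens of milliseconds each way.

```python
# Rough propagation-time sketch. The 6,000 km run length and the
# refractive index of ~1.47 are assumptions for illustration.
C = 299_792_458            # speed of light in vacuum, m/s
v = C / 1.47               # approximate speed of light in fiber, m/s
distance = 6_000_000       # assumed one-way fiber length, meters

one_way = distance / v     # seconds
round_trip = 2 * one_way

print(f"one way: {one_way * 1000:.1f} ms")
print(f"round trip: {round_trip * 1000:.1f} ms")
```

The round trip comes out well under 0.1 seconds, consistent with the question's estimate (real routes add some distance and equipment delay on top of this).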

In terms of data rate, common fast consumer connections get up to 10 Gbps. That's 10 bits of data every nanosecond. The average person who has fiber internet might get 1 Gbps, so 1 bit per nanosecond. Computers have been able to process data 2-3 times faster than that for a while now.
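The Gbps-to-bits-per-nanosecond conversion above is just unit arithmetic, since a nanosecond is one billionth of a second:

```python
# 10 Gbps expressed as bits per nanosecond (pure unit conversion).
bits_per_second = 10_000_000_000      # 10 Gbps
ns_per_second = 1_000_000_000         # nanoseconds in one second

bits_per_ns = bits_per_second // ns_per_second
print(bits_per_ns)  # -> 10
```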

Data coming in is just a sequence of bits, and each bit has a value of either 1 or 0. Regular wire does this by changing voltage: 0 V corresponds to a 0, and some higher voltage level to a 1. Fiber optic does this by flashing light. No light means a 0, while some amount of light means a 1. All you need is a sensor to read this input, and these sensors can detect flashes just as fast as they are sent.
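The light-on/light-off idea (on-off keying) can be sketched in a few lines. This is a toy model, not how real transceivers are implemented (they use more elaborate line coding and clock recovery), but it shows how any data reduces to a stream of on/off pulses:

```python
# Toy on-off keying: 1 -> light pulse on, 0 -> light off.
def to_pulses(data: bytes) -> list[int]:
    """Turn each byte into 8 on/off pulse values, most significant bit first."""
    return [(byte >> i) & 1 for byte in data for i in range(7, -1, -1)]

def from_pulses(pulses: list[int]) -> bytes:
    """Recover the original bytes from the detected pulse sequence."""
    out = bytearray()
    for i in range(0, len(pulses), 8):
        byte = 0
        for bit in pulses[i:i + 8]:
            byte = (byte << 1) | bit
        out.append(byte)
    return bytes(out)

pulses = to_pulses(b"Hi")
print(pulses)               # 'H' = 01001000, 'i' = 01101001
print(from_pulses(pulses))  # round-trips back to b'Hi'
```

A long document or video is "compressed into flashes" simply because it is already stored as bits, and each bit takes only a fraction of a nanosecond to transmit.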

You may have also heard of a Byte, which is just 8 bits.

The upper theoretical limit is about 1 petabit per second. At one bit per pulse, that implies a pulse rate of about 1 petahertz, or one pulse every femtosecond. REALLY fast.

It also means the physical length of each pulse in the fiber is incredibly small: about 200 nanometers, or 0.000008 inches.
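Those two numbers follow directly from each other: the pulse period is the reciprocal of the bit rate (assuming one bit per pulse), and the pulse's physical length is that period times the speed of light in the fiber (refractive index ~1.47 assumed here):

```python
# Back-of-envelope check of the petabit-per-second figures.
bits_per_second = 1e15                    # 1 petabit per second
pulse_period = 1 / bits_per_second        # seconds per pulse -> 1e-15 s = 1 fs

v = 299_792_458 / 1.47                    # speed of light in fiber, m/s (assumed n)
pulse_length_m = v * pulse_period         # physical length of one pulse

print(f"pulse period: {pulse_period:.0e} s")
print(f"pulse length: {pulse_length_m * 1e9:.0f} nm")
print(f"pulse length: {pulse_length_m / 0.0254 * 1e6:.1f} microinches")
```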

In practice you don't go that fast, but it's still really fast. Real cables carrying tens of gigabits per second are perfectly normal.