Advertised data speeds are theoretical maximums. Speeds that are actually guaranteed tend to already account for the overhead losses.
Data is still physical electrons moving through real-life mediums, and the variables involved are endless. There are processing delays, travel delays along the wire (and along fiber too; even light isn't instantaneous), and assorted bottlenecks.
Think of it like a car engine. The engine may produce 200 horsepower at the crankshaft, but once you attach a transmission, driveline, differential, axles, and tires, the horsepower hitting the pavement may only be 170 or less.
Every system has inefficiencies and losses, and the harder you push a system, the less efficient it becomes. You COULD get that 1,000 Mbps, but the system would need to be designed for 2,000 Mbps or more and only let you use the super-low-loss region. That would be a HUGE infrastructure expense that no company is going to take on. So your data gets crowded together with everyone else's, which adds losses (routing, processing, etc, etc, etc).
Everyone says that LEDs are super efficient. Sure, absolutely. Within a certain power range they convert nearly all of the power going into them into light. But if you want a brighter light, you either add more diodes (expensive), or throw more power at the existing diode and lose some of that energy as heat (cheap).
Advertised data speeds are usually arrived at using a math formula that assumes perfection in the system.
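To put rough numbers on what that "perfect" formula leaves out, here's a back-of-the-envelope sketch in Python. It assumes a typical 1500-byte Ethernet packet carrying TCP/IPv4 data with common options; exact header sizes vary, so treat the result as a ballpark, not a spec.

```python
# Rough illustration of why a "1000 Mbps" link tops out around ~940 Mbps of
# actual data. The header sizes below are typical, not universal.

LINE_RATE_MBPS = 1000   # the advertised, perfect-world number

MTU = 1500              # bytes of IP packet that fit in one Ethernet frame
IP_HEADER = 20          # IPv4 header, no options
TCP_HEADER = 32         # TCP header with the common timestamp option
ETH_OVERHEAD = 38       # Ethernet header + checksum + preamble + inter-frame gap

payload = MTU - IP_HEADER - TCP_HEADER   # your actual data in each packet
on_wire = MTU + ETH_OVERHEAD             # what the link really has to carry

efficiency = payload / on_wire
print(f"{payload} useful bytes out of {on_wire} on the wire")
print(f"Efficiency: {efficiency:.1%}")
print(f"Best-case usable speed: ~{LINE_RATE_MBPS * efficiency:.0f} Mbps")
# And that's before congestion, retransmissions, Wi-Fi losses, or a slow
# server on the other end.
```

That works out to roughly 94% efficiency, which is why "gigabit" connections typically benchmark in the low 940s of Mbps even under ideal conditions.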
When you send a package, you have to box it up and slap a label on it. If you mail someone a 5 lb weight, the parcel is going to weigh more than 5 lbs because of the box, the packing material, the label, and the ink.
Computers need to know where to send the data, and to get relevant data back they need a return address.
The packaging is the roughly 10% you're missing.
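To make the shipping analogy concrete, here's a toy sketch of what the "label" on a data packet roughly carries. This is not a real networking API; the field names and addresses are purely illustrative.

```python
# Toy illustration only: real packets use binary headers defined by the IP and
# TCP standards, but the "label" carries the same kind of information a
# shipping label does.

packet = {
    "label": {
        "to":   "203.0.113.7",    # destination address (where it's going)
        "from": "198.51.100.42",  # return address (where replies go)
        "to_port": 443,           # which "department" at the destination
        "from_port": 50312,       # which "department" sent it
        "sequence": 1,            # so the pieces can be reassembled in order
    },
    "contents": b"your actual data, up to roughly 1448 bytes per packet",
}

# Every packet carries its own label, and that label counts against your
# advertised speed even though it isn't "your" data.
```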