How do file hosts control the speed at which your file gets downloaded?


Just curious: how do file hosts (e.g. Rapidgator) control the speed at which you are able to download a file?

For example, if you choose the free option, the file takes X hours to download. But when you’re a Premium user, the file downloads 10x faster.

It’s the same file with the same file size, right? What do they do so that the speed at which a user can download a file differs depending on their subscription? If it’s feasible, I’d also like a visualization of what happens when your browser downloads via a free vs. a premium account.

Thank you!

Edit: grammar


4 Answers

Anonymous 0 Comments

A file is stored somewhere on disk. To send it to the destination, the code enters a loop: read some data, send the data to the destination. Rinse and repeat until done.

You can easily measure how fast the file is being transferred. You know that you started 10 seconds ago and transferred 10 MB in those 10 seconds, so the current speed is 1 MB/s. If that’s too fast, simply insert a pause in the process: do nothing for a while, until the average speed drops back down enough.

Such a decision can easily be made based on account type, time of day, how many other people are downloading, etc.
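As a rough sketch of that idea (assuming a socket-like `connection` object and made-up per-tier limits; a real host’s code will differ), the serving loop might look something like this:

```python
import time

# Hypothetical per-tier speed caps, in bytes per second.
LIMITS = {"free": 250 * 1024, "premium": 10 * 1024 * 1024}

def send_file(path, connection, max_bytes_per_sec, chunk_size=64 * 1024):
    """Read some data, send it, and pause whenever the average speed gets too high."""
    sent = 0
    start = time.monotonic()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break                           # whole file sent
            connection.sendall(chunk)           # hand this piece to the network
            sent += len(chunk)
            elapsed = time.monotonic() - start
            if elapsed > 0 and sent / elapsed > max_bytes_per_sec:
                # Do nothing until the average falls back under the cap.
                time.sleep(sent / max_bytes_per_sec - elapsed)
```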

Anonymous 0 Comments

You can use hardware (routers), software (programs on a storage server), or a combination of both to set limits on transfer speeds. This is called traffic shaping. Think of each connected PC as having its own tap controlling the flow of traffic; the server decides how far each tap is opened.
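One common software building block for this kind of shaping is a token bucket, where each "tap" only lets data through as tokens trickle in. A minimal sketch (the names and numbers are illustrative, not any particular product’s API):

```python
import time

class TokenBucket:
    """Tokens trickle in at `rate` bytes/sec; sending is allowed only
    when enough tokens have built up, which caps the average speed."""

    def __init__(self, rate, burst):
        self.rate = rate            # refill rate, bytes per second
        self.capacity = burst       # maximum tokens the bucket can hold
        self.tokens = burst
        self.last = time.monotonic()

    def wait_for(self, nbytes):
        while True:
            now = time.monotonic()
            # Refill based on elapsed time, never beyond the burst capacity.
            self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= nbytes:
                self.tokens -= nbytes
                return
            # Not enough tokens yet: sleep until the shortfall has trickled in.
            time.sleep((nbytes - self.tokens) / self.rate)

# A free user's tap: roughly 256 KB/s with small bursts allowed.
tap = TokenBucket(rate=256 * 1024, burst=64 * 1024)
```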

Anonymous 0 Comments

You have two connections to the internet: a download connection and an upload connection. When you download a file from someone, the other side is uploading it to you.

When you click the download button, the website sends the request to a webserver, which begins to upload to you. That request includes additional information like your account type and premium status. The webserver takes that information and uses it when it serves the file to you. If you’re a premium user, it will try to serve the data as fast as it can. If you’re not a premium user, it decides to send you only 250 kilobytes of data before waiting a second to send another 250 kilobytes. If you’re an ad-supported user, it will send 500 kilobytes before waiting a second.

It’s not that the file is a different size; it’s that the webserver is purposely closing its upload valve to push you to upgrade, because money.
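In code, that tiered valve could look roughly like this (the chunk sizes are just the numbers from the example above, and `connection` is assumed to be a socket-like object):

```python
import time

# Hypothetical policy: bytes sent per one-second interval for each tier.
TIER_CHUNK = {
    "free": 250 * 1024,           # 250 KB, then wait a second
    "ad_supported": 500 * 1024,   # 500 KB, then wait a second
    "premium": None,              # no artificial pause at all
}

def serve_download(file_obj, connection, account_type):
    chunk_size = TIER_CHUNK.get(account_type) or 1024 * 1024
    throttled = TIER_CHUNK.get(account_type) is not None
    while True:
        chunk = file_obj.read(chunk_size)
        if not chunk:
            break
        connection.sendall(chunk)
        if throttled:
            time.sleep(1)   # free and ad-supported users wait between chunks
```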

Anonymous 0 Comments

The world wide web is built upon two major protocols: Transmission Control Protocol (TCP) and Internet Protocol (IP), collectively referred to as TCP/IP. These protocols determine how data (the *payload*) is encapsulated for sending over a network, including things like adding *headers* that help get the traffic to its destination and allow replies to be sent back.

Under TCP/IP, any data exchanged between two devices is sent using packets. The postal system is a popular analogy for computer networks; it has literally influenced how they’re designed. In this analogy, the payload is the letter you want to send, and the TCP/IP packet is the envelope you put it in. The maximum size of each packet is determined by the Maximum Transmission Unit (MTU), which can be configured in the network hardware. An MTU of around 1500 bytes is common, meaning no individual TCP/IP packet will be larger than 1500 bytes.

But pretty much anything you want to download today is much larger than 1500 bytes, so larger files are broken up and sent as a stream of multiple packets, which are reassembled at the receiving end. It’s like when a letter has too many pages and you have to split it across separate envelopes. TCP attaches sequence numbers so the pieces can be put back in the right order and any lost packets can be detected and resent.
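Here’s a toy illustration of that splitting and reassembly (the header size is a rough figure, and in reality the TCP stack does all of this for you):

```python
MTU = 1500          # typical maximum transmission unit, in bytes
HEADER_SIZE = 40    # rough combined size of the IP and TCP headers

def packetise(payload: bytes):
    """Split a payload into numbered chunks that each fit inside one packet."""
    max_data = MTU - HEADER_SIZE
    return [(seq, payload[i:i + max_data])
            for seq, i in enumerate(range(0, len(payload), max_data))]

def reassemble(packets):
    """Put the chunks back in sequence-number order (part of what TCP does)."""
    return b"".join(data for _, data in sorted(packets))

packets = packetise(b"x" * 10_000)                            # a ~10 KB "file" becomes several packets
assert reassemble(list(reversed(packets))) == b"x" * 10_000   # arrival order doesn't matter
```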

A network connection generally cannot control how fast an individual packet travels. Electricity goes as fast as the copper wire allows; light travels as fast as the fibre optic cable allows. A router or switch will forward packets to their destination as fast as its hardware allows, though it can prioritise certain types of traffic over others. But overall these speeds are pretty much fixed.

But what a web server *can* do is control how quickly it sends each subsequent packet. So a file hosting service might choose to limit how quickly it streams the packets to any given user, and it can easily incorporate a system that sends packets faster if a user pays for a premium account. It’s just like sending a series of letters: you don’t really have any control over how quickly the mail truck picks them up, the processing centre sorts them, or the mailman delivers them, but you can certainly wait a few days before posting the next letter in the sequence.
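The arithmetic behind that pacing is simple: to hit a target average speed, the server just spaces the packets out accordingly (the tier speeds below are invented for illustration):

```python
PACKET_SIZE = 1500  # bytes of data sent per packet (roughly one MTU)

def inter_packet_delay(target_bytes_per_sec):
    """How long to wait between packets to average out at the target speed."""
    return PACKET_SIZE / target_bytes_per_sec

print(inter_packet_delay(100 * 1024))         # free tier ~100 KB/s -> ~0.015 s between packets
print(inter_packet_delay(10 * 1024 * 1024))   # premium ~10 MB/s    -> ~0.00014 s between packets
```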