Why are Web technologies trying to compress data better if internet speeds are becoming faster?

Data compression = good for hardware. You can download a game super fast, but if your computer can’t handle that much data at once, it’ll still take 3–5 business days to open the damn thing.

Internet may be getting faster, but it’s not getting faster for everyone *equally*. I’d kill to get decently priced gigabit, but I’m already paying more than I’d like for 300 megabit. Even if you can pay, it’s up to your ISP to provide that service in your area, and they may not be willing to pay to upgrade the infrastructure.

On the flip side, we’re sending a lot more data than before. 4K video has become much more common over the past five years, and the same goes for higher-fidelity audio. Triple-A games can run to hundreds of gigabytes in size due to massive open worlds, high-res textures, and more.

We need more efficient ways of sending increasingly bigger blobs of data, hence the focus on compression.
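A quick sketch of that idea in Python, using the standard-library `zlib` (the same DEFLATE algorithm behind gzip on the Web). Repetitive data like HTML, JSON, or game assets compresses dramatically, so fewer bytes actually cross the wire:

```python
import zlib

# Repetitive data (like HTML, JSON, or many game assets) compresses well.
payload = b"<div class='item'>Hello</div>" * 1000

compressed = zlib.compress(payload, level=9)
print(len(payload), "bytes before,", len(compressed), "bytes after")

# Lossless: decompression restores the exact original bytes.
assert zlib.decompress(compressed) == payload
```

Real Web servers do exactly this transparently (via the `Content-Encoding: gzip` response header), so the browser downloads the small version and inflates it locally.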

Why would you not try to do that? It’s not one or the other; you can do both. Better compression makes the products you already have better even on the same internet connection, and compressing data is cheaper than paying for a faster connection.

More resources means lazier programming. In the ’80s you could buy a Sinclair Spectrum with 16K or 48K of RAM. In 16K you could fit an entire game that could last hours. Today even an empty Word document won’t fit in 16K.

With limited resources, programmers came up with ingenious ways of fitting as much as possible into the small space they had. As resources grew, so did the size of programs and games; now a single game can run to many gigabytes of download.

Why is this relevant to the Web? You only have to look at how Web pages used to look compared to the graphics- and functionality-rich sites we have today. I worked for a Web company, and zero optimisation was done on images unless complaints were made; sites were simply designed around a 56k modem, ADSL, and so on. As speeds increased, demands increased too.

Now companies want the most feature-rich experience for their customers (understandably), but customers won’t accept slow sites. So compression is still actively worked on to counteract this, so that people on slower connections can still get a reasonable experience.

Browser makers also remain in competition with each other (particularly Mozilla Firefox and Google Chrome) to provide the best experience. End users will often blame their browser rather than their slow connection, so anything browsers can do to counteract slow connections helps with that perception. Processing power at the computer end is usually less of a bottleneck than connection speed.

There is also the issue that many companies pay for their website bandwidth on metered connections, billed per megabyte or gigabyte, so the more the data can be compressed, the less they pay.

It’s true that internet speeds are getting faster, but apps are also growing in size.

The more you can compress your app, the more features you can fit into it.

“Information Theory” is the science that guides our understanding of digital communications and storage.

Put simply, advances in information theory let us do two things: send bits faster (or store them denser), and convey more information in fewer bits (compression).
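The “fewer bits” half has a precise measure: Shannon entropy, the theoretical minimum average number of bits you need per symbol. A minimal sketch in Python (the function name is my own, not a standard one):

```python
import os
from collections import Counter
from math import log2

def entropy_bits_per_symbol(data: bytes) -> float:
    """Shannon entropy: the minimum average number of bits per symbol."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Ordinary English text is predictable, so it needs well under the 8 bits
# a raw byte encoding spends per character; random bytes need close to 8.
print(entropy_bits_per_symbol(b"to be or not to be, that is the question"))
print(entropy_bits_per_symbol(os.urandom(100_000)))
```

The gap between a file’s raw size and its entropy is exactly the slack that compressors like gzip and Brotli squeeze out; truly random data has no slack, which is why already-compressed files don’t compress further.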

Without working on this problem from both ends there’s no way we’d be able to make internet media work as well as it does. It gets even more dire when you remove wires from the equation.

Edit: I forgot to mention the most important part about compression. It makes the internet cheaper to operate for the same user experience. In engineering, decisions almost always come down to cost!
