There are limits to lossless compression, which I would call mathematical rather than physical limits. They are described by Shannon's source coding theorem, which uses the concept of Shannon entropy, also called information entropy.

It says that truly random data can't be compressed, and the closer data is to random, the less you can compress it. There is a lot of random-looking data in audio and video, so you can't compress them much in a lossless way.
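To make that concrete, here is a small sketch (my own illustration, not from any particular library) that computes the Shannon entropy of a byte string in bits per byte. Uniformly random bytes sit near the maximum of 8 bits per byte, which is why no lossless compressor can shrink them; highly repetitive data sits far below it:

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (8.0 is the maximum, reached by uniform random bytes)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random_data = os.urandom(100_000)   # near 8 bits/byte: essentially incompressible
repetitive = b"ab" * 50_000         # two equally likely symbols: exactly 1 bit/byte

print(shannon_entropy(random_data))  # close to 8.0
print(shannon_entropy(repetitive))   # 1.0
```

The entropy gives a lower bound on the average number of bits per symbol any lossless encoding can achieve, which is exactly the limit the source coding theorem describes.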

Lossless compression of programs and documents on a computer might result in, let's say, 50% down to 5% of the original size. A test on the portable apps in my Dropbox showed the .exe and .dll files compressing to 25% of their original size with 7-Zip, so 4x compression.
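You can reproduce this kind of measurement yourself. The sketch below uses Python's built-in zlib (a weaker codec than 7-Zip's LZMA, so ratios will be more modest, but the effect is the same): structured data like machine code compresses well, while random bytes do not compress at all. The sample "instruction" bytes are made up for illustration:

```python
import os
import zlib

def ratio(data: bytes) -> float:
    """Compressed size as a fraction of the original size, zlib at maximum level."""
    return len(zlib.compress(data, 9)) / len(data)

# Hypothetical stand-in for executable code: repetitive, structured byte patterns.
structured = b"MOV EAX, [EBP+8]\nCALL printf\nRET\n" * 3000
random_bytes = os.urandom(100_000)

print(f"structured:  {ratio(structured):.3f}")   # far below 1.0
print(f"random:      {ratio(random_bytes):.3f}") # about 1.0 (slightly above, due to overhead)
```

Real executables fall between these two extremes, which is why a 25% result (4x) is plausible for .exe and .dll files.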

Lossless compression of audio results in a file of roughly 60% of the original size, but let's call it 2x compression. You would get somewhat more with video.

Later video formats like x265 (HEVC) were designed with a goal of roughly 1000x compression, which is many times more than any lossless method could ever achieve.

An x265 file I looked at is 271 MB, of which the video stream is 202 MB. It contains 79 minutes of video at 1280×720, 23.976 fps, with 4:2:0 chroma subsampling, so 12 bits per pixel.

That is 79 × 60 × 23.976 frames × 1280 × 720 pixels × 12/8 bytes = 157,104,562,176 bytes, or 149,826 MB, about 146 GB of raw data. So the compression ratio is 149,826 / 202 ≈ 742x.
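The arithmetic above can be checked with a few lines of Python, using the numbers from the file described earlier:

```python
minutes = 79
fps = 23.976
width, height = 1280, 720
bits_per_pixel = 12   # 4:2:0 subsampling: 8 bits luma + 2 x 8/4 bits chroma per pixel

frames = minutes * 60 * fps
raw_bytes = frames * width * height * bits_per_pixel / 8
raw_mib = raw_bytes / 1024**2

print(f"raw size: {raw_bytes:.0f} bytes")      # about 157 billion bytes, roughly 146 GB
print(f"ratio: {raw_mib / 202:.0f}x")          # raw MB divided by the 202 MB video stream
```

Note the ratio is against the 202 MB video stream only; dividing by the full 271 MB container (which also holds audio) would give a somewhat lower figure.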

So in my example, video compression squeezed the data about 185x more than compression of computer programs did. We use lossy compression because it is simpler to achieve such ratios that way, and it is over 100 times more efficient than lossless compression.
