eli5 how do LLM models of the size of few GBs have information about almost everything?


I tried running Llama 2 on my local machine. The size of the model is roughly 4 GB, and it runs offline.

It has so far answered every question I've asked about diverse topics in computer science, literature, philosophy, and biology. How is so much information stored in such a small size?


4 Answers

Anonymous

>I tried running Llama 2 on my local machine. The size of the model is roughly 4 GB, and it runs offline.

>It has so far answered every question I've asked about diverse topics in computer science, literature, philosophy, and biology. How is so much information stored in such a small size?

Other answers have tried to explain the nature of LLMs, but I think the most crucial thing here is that in an age of 80GB Blu-ray rips and 50GB game patches, it's easy to lose track of **how much freaking data** 4GB actually is.
For comparison, the entire text of the English Wikipedia is about 13GB.
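To get a feel for that, here's a rough back-of-envelope sketch of how much plain text would fit in 4GB. The averages used (bytes per English word, words per novel) are my own assumptions, not figures from the post, and an LLM's weights aren't literally stored text, but it gives a sense of scale:

```python
# Back-of-envelope: how much plain text fits in 4 GB?
# Assumptions (rough averages, not from the original post):
#   - ~6 bytes per English word (5 letters plus a space)
#   - ~90,000 words per typical novel

GB = 1024 ** 3
model_size_bytes = 4 * GB

bytes_per_word = 6
words_per_novel = 90_000

words = model_size_bytes // bytes_per_word   # ~715 million words
novels = words // words_per_novel            # ~7,900 novels

print(f"{words:,} words, roughly {novels:,} novels' worth of text")
```

Under those assumptions, 4GB of raw text is on the order of 700 million words, i.e. thousands of books, before you even consider that model weights encode statistical patterns far more compactly than literal text.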
