>I tried running Llama 2 on my local machine; the model is roughly 4 GB in size, and it runs offline.
>It has so far answered every question I've asked, on diverse topics in computer science, literature, philosophy, and biology. How is so much information stored in such a small size?
Now, other answers have tried to explain the nature of LLMs, but I think the most crucial thing here is that in an age of 80GB Blu-ray rips and 50GB game patches, it's easy to lose track of **how much freaking data** 4GB actually is.
The entire text of the English Wikipedia is about 13GB, for comparison.
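To get a feel for the scale, here's a rough back-of-envelope calculation. The bytes-per-character, page, and book figures below are assumed averages for illustration, not exact numbers:

```python
# Back-of-envelope: how much plain text fits in 4 GB?
# Assumed figures: ~1 byte per ASCII character, ~2,000 characters
# per printed page, ~300 pages per book. Rough averages; the order
# of magnitude is the point.

size_bytes = 4 * 1024**3      # 4 GB in bytes
chars = size_bytes            # ~1 byte per character in plain ASCII text
pages = chars / 2_000         # assumed characters per printed page
books = pages / 300           # assumed pages per book

print(f"{chars:,.0f} characters")  # ~4.3 billion characters
print(f"{pages:,.0f} pages")       # ~2.1 million pages
print(f"{books:,.0f} books")       # ~7,000 books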