eli5 how do LLM models only a few GBs in size have information about almost everything?


I tried running Llama 2 on my local machine. The model is roughly 4 GB, and it runs offline.

It has so far answered all questions I asked, about diverse topics in computer science, literature, philosophy and biology. How is so much information stored in such a small size?
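As a rough back-of-envelope sketch (assuming this is the 7-billion-parameter Llama 2 variant quantized to about 4 bits per weight, which is a common local-inference setup), the file size lines up with the parameter count:

```python
# Back-of-envelope: how big is a 7B-parameter model on disk?
# Assumption: ~4 bits per weight (typical quantized local format);
# the exact quantization scheme varies and adds some overhead.
params = 7_000_000_000
bits_per_weight = 4
size_gb = params * bits_per_weight / 8 / 1e9
print(f"{size_gb:.1f} GB")  # ~3.5 GB of raw weights, close to the ~4 GB file
```

So the "information" isn't stored as text at all; it's encoded in billions of numeric weights.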


4 Answers

Anonymous

It doesn’t have information; it just knows what sentences look like. It seems to give you answers because it produces good sentences. Alas, a lot of the time those sentences are false. I asked ChatGPT last week what the odds are of getting 90 heads out of 100 flips, and it was off by a mile, but the punctuation was perfect.
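For reference, that coin-flip question has an exact closed-form answer that you can check in a couple of lines, which is the kind of computation a next-word predictor has no mechanism to perform:

```python
import math

# Exact probability of exactly 90 heads in 100 fair coin flips:
# C(100, 90) * (1/2)^100
p = math.comb(100, 90) * 0.5 ** 100
print(p)  # on the order of 1e-17, i.e. essentially never
```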
