It’s a *language* model. It stores information about how words are typically arranged in sentences, and what sort of sentences would follow a given question/prompt.
If you ask it a question about biology, it doesn’t understand what your question means or what its response means. It just knows how people on the internet typically respond to a question like that.
And you actually *could* fit a boatload of knowledge in a few GB if you’re only storing text, no photos or videos. But it doesn’t really keep a record of everything it was trained on. It just “remembers” the typical patterns and arrangements of words. This is how it can come up with new responses or ideas that *potentially* nobody has ever said before, but that isn’t because it understands new ideas or is actually creative. It’s just spitting out words in a way that mimics how humans write, which can be convincing and sometimes right, but is very often wrong.
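To make that concrete, here’s a minimal sketch of the same idea at a tiny scale: a toy “model” that only stores which word tends to follow which (just counts, not the original sentences), then chains those statistics together to produce sentences it was never actually given. This is *not* how ChatGPT is built (real models use huge neural networks and look at far more than the previous word), it’s just an illustration of “predict the next word from stored patterns.”

```python
import random
from collections import defaultdict, Counter

# A tiny "training set" -- real models train on trillions of words.
corpus = (
    "the cat sat on the mat . "
    "the dog sat on the rug . "
    "the cat chased the dog . "
    "the dog chased the cat ."
).split()

# "Training": count which word follows which. Only the counts are kept;
# the original sentences are thrown away.
follows = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    follows[current_word][next_word] += 1

# "Generation": repeatedly pick a likely next word given the last one.
def generate(start="the", max_words=10):
    words = [start]
    for _ in range(max_words):
        options = follows.get(words[-1])
        if not options:
            break
        # Sample in proportion to how often each word followed this one.
        choices, counts = zip(*options.items())
        words.append(random.choices(choices, weights=counts)[0])
    return " ".join(words)

print(generate())  # e.g. "the dog sat on the mat ." -- a sentence never in the corpus
```

Same basic trick, just scaled up enormously: stored statistics about word patterns, not stored understanding.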