why do models like ChatGPT forget things during conversations or make things up that are not true?





Anonymous

You’re getting loads of opinionated answers, with many people arguing over what it means to “think” or not, which gets very philosophical and, I think, isn’t really suitable for an ELI5 explanation.

To answer your question: ChatGPT repeats patterns it picked up from reading loads of sources (the internet, books, etc.), so it will produce whatever is most likely to appear as the answer to your question. If a wrong answer is repeated often enough in those sources, ChatGPT will treat it as the right answer, and in that case it will be wrong.
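To make that concrete, here’s a tiny toy sketch in Python (made-up numbers, and obviously not how ChatGPT actually works inside, since real models predict text with a neural network): if the wrong answer shows up more often in what the model has read, the “most likely” answer it picks will be the wrong one.

```python
from collections import Counter

# Toy sketch with hypothetical data: pretend the "training data" is just a
# pile of answers the model has read for the question
# "What is the capital of Australia?". The wrong answer (Sydney) shows up
# more often than the right one (Canberra).
seen_answers = ["Sydney"] * 8 + ["Canberra"] * 3 + ["Melbourne"] * 1

def most_likely_answer(answers):
    """Return whichever answer appeared most often in the pile,
    roughly the way a language model favours the most probable continuation."""
    return Counter(answers).most_common(1)[0][0]

print(most_likely_answer(seen_answers))  # -> "Sydney": the most frequent answer wins,
                                         # whether or not it is actually true
```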
