why do models like ChatGPT forget things during conversations or make things up that are not true?


Anonymous

Why does your Scarlet Macaw seem to constantly lose the thread of your conversation? Because it's just parroting back what it's learned.

Language models have read an uncountable number of human conversations. They have learned which words commonly go with which responses. They understand none of it.

Language models are trained parrots performing the trick of appearing to be human in their responses. They don’t care about truth, or accuracy, or meaning. They just want the cracker.
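A toy sketch of that "statistical parroting" idea, assuming nothing more than a bigram next-word model (real systems are vastly more sophisticated, but the principle of predicting likely continuations from learned associations is the same):

```python
import random
from collections import Counter, defaultdict

# A tiny toy corpus standing in for the "uncountable number of conversations".
corpus = (
    "the parrot wants a cracker . the parrot repeats what it hears . "
    "the model predicts the next word . the model repeats patterns it has seen ."
).split()

# Count which word tends to follow which: pure association, no understanding.
follows = defaultdict(Counter)
for cur, nxt in zip(corpus, corpus[1:]):
    follows[cur][nxt] += 1

def next_word(word):
    """Pick a likely next word by frequency alone -- no notion of truth."""
    counts = follows[word]
    if not counts:
        return "."
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate a plausible-sounding sentence one word at a time.
random.seed(0)
out = ["the"]
for _ in range(8):
    out.append(next_word(out[-1]))
print(" ".join(out))
```

The output is fluent-looking but has no connection to reality, which is the essence of why such systems can "make things up": they optimize for plausible continuations, not true ones.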
