You’re getting loads of opinionated answers, with many people arguing over what counts as “thinking”, which gets very philosophical and isn’t really suited to an ELI5 explanation I think.

To answer your question: ChatGPT repeats patterns it learned from reading loads of sources (the internet, books, etc.), so it produces whatever is most likely to appear as the answer to your question. If a wrong answer is repeated often enough in those sources, ChatGPT will treat it as the right answer, and in that case it’ll confidently be wrong.
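To make that concrete, here’s a toy sketch in plain Python. This is nothing like ChatGPT’s real internals (which predict one token at a time with a neural network, not a lookup table), and the question/answer data here is made up. It just shows the “most frequent answer wins” idea, including how a popular wrong answer beats a rarer right one:

```python
from collections import Counter

# Made-up "training data": the same question answered across many sources.
# The wrong answer ("Sydney") appears more often than the right one.
training_data = [
    ("capital of australia?", "Sydney"),
    ("capital of australia?", "Sydney"),
    ("capital of australia?", "Sydney"),
    ("capital of australia?", "Canberra"),
]

def most_likely_answer(question: str) -> str:
    """Return whichever answer appeared most often for this question."""
    answers = [a for q, a in training_data if q == question]
    return Counter(answers).most_common(1)[0][0]

# Prints "Sydney": wrong, but it's what the data repeats most.
print(most_likely_answer("capital of australia?"))
```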