When you ask ChatGPT something, it doesn't just come up with one single answer. It's more like:
You: “Who is the president of the USA?”
ChatGPT: (magic)
ChatGPT: (90% Biden, 9% Trump, 1% Obama, fuck it, I’ll go with Biden)
ChatGPT: “Biden”
Shit happens when, instead of 90% vs 9%, the question is more complex and ChatGPT's magic only finds answers with low certainty. It will still choose the most likely one, but if that one only has, say, a 1% chance of being right, it will most likely be wrong.
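Here's a toy Python sketch of that "pick the most likely answer" step. The numbers and answer lists are completely made up, and a real model works token by token rather than over whole answers, but it shows why the "best" guess can still be a bad one:

```python
# Toy illustration only: made-up probabilities, not ChatGPT's real internals.

confident = {"Biden": 0.90, "Trump": 0.09, "Obama": 0.01}
unsure = {"guess A": 0.04, "guess B": 0.03, "guess C": 0.02}  # ...plus lots of other tiny options

def pick(probs):
    # Greedy choice: take the most likely option, however low its probability is.
    best = max(probs, key=probs.get)
    return best, probs[best]

print(pick(confident))  # ('Biden', 0.9)    -> high certainty, probably right
print(pick(unsure))     # ('guess A', 0.04) -> still the "best" guess, probably wrong
```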