ELI5: Why does AI like ChatGPT or Llama 3 make things up and fabricate answers?


I asked it for a list of restaurants in my area using Google Maps, and it said there is a restaurant (Mug and Bean) in my area and even gave a real address, but this restaurant is not in my town. It's only in a neighboring town, with a different street address.



Anonymous

Because while these models are trained on internet data, the internet itself is full of misinformation, and on top of that, learning algorithms are still in their infancy. As promising as they are, they still have plenty of issues that can produce confidently wrong answers.

On the other hand, in my experience ChatGPT and similar AIs don't seem to lie nearly as much as some news articles claim.

I even tested this recently when there were news articles claiming that ChatGPT almost always gets math wrong. I started asking it math questions of varying difficulty, and it only really got one of them wrong, and that was due to a mistake I could see even a human making.

And if I ask it something basic, like who the 30th president of the USA was, it usually gets that right. It mostly seems to have issues with more logic-related questions, because the AI itself isn't really designed to be logical; it's designed to be conversational (see the sketch below).
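If you're curious what "designed to be conversational" means under the hood, here's a rough sketch. It assumes the Hugging Face `transformers` library and the small open `gpt2` model, purely for illustration (ChatGPT's and Llama 3's actual serving setups aren't shown here). The point is that the model only scores which word plausibly comes next; at no step does it check whether the finished sentence is true, which is why it can happily place a real restaurant at the wrong address.

```python
# Minimal sketch: a language model just ranks plausible next words.
# Assumes the `transformers` and `torch` packages and the public `gpt2` model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "A popular restaurant in my town is called"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # Scores for every possible next token, taken from the last position.
    logits = model(**inputs).logits[0, -1]
probs = torch.softmax(logits, dim=-1)

# Print the five most "plausible" next words. None of them are fact-checked;
# the model is just continuing the sentence in a statistically likely way.
top = torch.topk(probs, 5)
for p, idx in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(idx))!r}  p={p:.3f}")
```

Whatever name comes out of that loop will sound like a restaurant, because restaurant-sounding names are what usually follow a sentence like that in the training data, not because the model looked anything up.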
