ChatGPT is not a general intelligence like Jarvis in Iron Man. We're decades away from that.
It is a large language model. It collects and studies which words usually show up with one another, and uses statistics to predict the words that should come after the ones you provide. In other words, it's similar to a fortune teller who uses your words and verbal cues to pretend to have psychic power.
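The word-statistics idea can be sketched with a toy "bigram" model: count which word follows which, then predict the most common follower. This is a drastically simplified stand-in for what GPT actually does (the corpus and function names here are invented for illustration), but the core idea of predicting the next word from observed co-occurrence is the same:

```python
from collections import Counter, defaultdict

# A toy corpus standing in for the internet-scale text a real model trains on.
corpus = "the cat sat on the mat the cat ate the fish the dog sat on the rug".split()

# For each word, count which words follow it and how often.
following = defaultdict(Counter)
for word, next_word in zip(corpus, corpus[1:]):
    following[word][next_word] += 1

def predict(word):
    """Return the statistically most likely next word, if we've seen this word before."""
    counts = following.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict("the"))  # -> "cat", because "cat" follows "the" most often in this corpus
```

A real model uses far longer contexts and learned neural weights instead of raw counts, but notice what's missing even here: the model has no idea what a cat *is*, only which words tend to sit next to each other.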
If you give it common words that the internet likes to lie and joke about, it'll give you the cringy answers that people share on TikTok.
It doesn’t know what a real/fake reference is. It knows how to string words together into something that sounds like an argument, and it knows that when other people write arguments, they put a string of names/dates/titles/publishers at the end in APA format, but it doesn’t understand the connection between those things.
GPT is a baby. Beyond the superficial level, in a very real sense, it doesn’t know what it’s doing. It’s just good at mimicking people who do.
If I can anthropomorphize a bit, it doesn't intend to lie or cheat here. In a few years it might understand those concepts, but right now it's trying to play a game without knowing most of the rules. When my friend taught me basketball, I committed double-dribbling, up-and-down, and travelling multiple times in the first minute. It's not that I set out to be a cheat; I was just trying to do what I'd seen basketball players do, without knowing any of the nuances.