How does GPT solve logic and math problems?

My very limited understanding of GPT is that it’s basically a text generator. Why and how can it solve logic and math problems? Or is it just an emergent ability of LLMs that nobody understands?

14 Answers

Anonymous 0 Comments

“Sparks of AGI” is a lecture on YouTube that I have been forcing everyone I know to watch. GPT-4 is, or rather was, intelligent by nearly every metric we have. It predicts the next word, but to do that extremely well it had to build an internal model of the world. For real, the unicorn segment gives me shivers just thinking about it.

Anonymous 0 Comments

It can’t.

It’s just a really advanced version of auto-prediction. Kinda like how your phone might predict the next word you want to write based on your typing history.

Except it can predict entire sentences and paragraphs, because it’s more complicated than that and has loads of text history to reference.

(Okay, I already know some people will nuh-uh that statement. This is a deliberately rough explanation of what it is and does; I’m just watering it down so it’s ELI5 friendly.)
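
To make the autocomplete analogy concrete, here’s a toy next-word predictor in Python. It just counts which word follows which in a tiny corpus and samples from those counts. Real GPT models use a huge neural network over tokens, not a count table, so treat this as an illustration of the task, not of GPT’s internals.

```python
from collections import Counter, defaultdict
import random

corpus = "the cat sat on the mat and the cat ate the fish".split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Sample a next word, weighted by how often it followed `word`."""
    counts = following[word]
    if not counts:
        return None
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate a short continuation one word at a time, feeding each
# prediction back in as the new context.
word, sentence = "the", ["the"]
for _ in range(5):
    word = predict_next(word)
    if word is None:
        break
    sentence.append(word)
print(" ".join(sentence))  # e.g. "the cat sat on the mat"
```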

You can ask it a math problem, and it might be able to answer it well enough. Or maybe give it a logical statement, and it might appear to understand it.

But that’s just because it’s seen something similar before, so it already had the answer readily available.

But try asking it questions it isn’t familiar with, and you’ll quickly see how limited it is.

Try asking it to play Four Fours, the simple math game where you build each target number out of exactly four 4s. It’s easy to play, but GPT is terrible at it: it can’t reliably generate new answers to questions it has never seen answered.
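
The basic version of the game is mechanical enough that a few lines of brute force enumerate every reachable value, which is what makes GPT’s failures at it telling. A sketch, assuming only +, -, *, and / (richer variants also allow 44, square roots, factorials, and so on, which this skips):

```python
from itertools import product

def combine(a, b):
    """All values reachable by applying one arithmetic op to a and b."""
    results = [a + b, a - b, b - a, a * b]
    if b != 0:
        results.append(a / b)
    if a != 0:
        results.append(b / a)
    return results

def values(n):
    """All values expressible using exactly n fours."""
    if n == 1:
        return {4.0}
    out = set()
    # Split the n fours into a left group of k and a right group of n - k,
    # then combine every value from each side with every operator.
    for k in range(1, n):
        for a, b in product(values(k), values(n - k)):
            out.update(combine(a, b))
    return out

reachable = values(4)
for target in range(11):
    # Some targets (10, for instance) need the richer rules.
    print(target, "reachable" if target in reachable else "not reachable")
```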

Or you can ask it to explain this scenario in detail:

Three men go into a bar. The bartender asks, ‘Would you all like a beer?’ The first man says, ‘I don’t know.’ The second says, ‘I don’t know.’ The last man says, ‘Yes.’

That’s a scenario whose logic is simple to follow. But again, if GPT hasn’t seen it before, it can’t really understand it and will reply with seemingly irrelevant explanations.
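
For what it’s worth, the intended logic fits in a few lines. The question is about all three men, so a single ‘no’ settles it; ‘I don’t know’ therefore signals ‘I want one, but I can’t speak for the others’; and the last man, having heard no ‘no’, can answer for everyone. Here’s that reasoning sketched in Python (the encoding is my own, purely illustrative):

```python
def answer(wants_beer, prior_answers, group_size=3):
    """What each man can truthfully say to 'Would you ALL like a beer?'"""
    if not wants_beer or "No" in prior_answers:
        # One man not wanting a beer makes "all of you?" a definite No.
        return "No"
    if len(prior_answers) == group_size - 1:
        # Last to speak: nobody said No, and every "I don't know" before
        # him meant "I want one, but I can't answer for the rest".
        return "Yes"
    # He wants one, but can't speak for the men who haven't answered yet.
    return "I don't know"

# Replay the joke: all three men want a beer.
answers = []
for _ in range(3):
    answers.append(answer(True, answers))
print(" / ".join(answers))  # I don't know / I don't know / Yes
```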
