2 squared is 4, which is more than 2. However, 0.2 squared is 0.04, which is less than 0.2. Can someone explain this apparent contradiction, where a number between 0 and 1 becomes less than the original when squared, but anything greater than 1 becomes more?

And I understand that if I'm talking about meters, for example, squaring gives square meters, so the comparison is no longer apples to apples. But in situations where there is no unit (for example, a math equation where you need to find x, whatever x is), why is this not contradictory?

13 Answers

Anonymous

Squaring something is shorthand for saying "MULTIPLY IT BY ITSELF". So think of it that way.

2 squared is 2 multiplied by 2, that’s “two twos”. 3 squared is 3 multiplied by 3, that’s “three threes”.

***When you multiply any positive number by something bigger than 1, you get a larger number.*** So squaring any number bigger than 1 results in a larger number.
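If it helps to see that rule numerically, here is a minimal Python sketch (the sample values are just illustrative) that squares a few numbers greater than 1:

```python
# Quick check: squaring any number greater than 1 gives a larger number.
for x in [1.1, 2, 3, 10, 100]:
    print(f"{x} squared is {x * x} -> {'larger' if x * x > x else 'not larger'}")
```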

Now let’s look at 0.2. It’s actually one-fifth. When you square it, you’re saying 0.2 * 0.2. That’s the same as saying “one fifth of one fifth”.

A fifth of *anything* (any positive amount) is SMALLER than that thing. And that's what you're doing.

How about 0.5? That's actually one half. When you square it, you're saying 0.5 * 0.5, or "one half of one half". Once again, half of anything is SMALLER than that thing.

***When you multiply any positive number by something between 0 and 1, you get a smaller number.*** So squaring any number between 0 and 1 results in a smaller number.
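And here is the same sort of sketch for the other rule, again with illustrative values, showing that squaring a number strictly between 0 and 1 always gives a smaller number:

```python
# Quick check: squaring any number strictly between 0 and 1 gives a smaller number.
for x in [0.2, 0.5, 0.9, 0.99]:
    print(f"{x} squared is {x * x} -> {'smaller' if x * x < x else 'not smaller'}")
```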
