if numbers are finite, why do we use /100 or 100% to measure limit?



edit: if numbers are *INFINITE*. sorry, my thumb slipped

In: Mathematics

1. Numbers are infinite. Each individual number is finite, but there are infinitely many of them.
2. Percentages are neat. If you have 4623765872675632 apples, out of which 65463875682 are spoilt, is that a lot of the apples? Answer: It’s about 0.0014%, so roughly one in seventy thousand apples.
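A quick sketch in Python checks that proportion (the figures are just the ones from the apple example above):

```python
# Proportion of spoilt apples, using the figures from the example above
total = 4623765872675632
spoilt = 65463875682

fraction = spoilt / total
percent = fraction * 100
print(f"{percent:.4f}% spoilt, i.e. roughly 1 in {round(1 / fraction):,}")
```

The point of the percentage is the same as the point of the "1 in N" form: it turns two unwieldy numbers into one small, comparable one.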

Because we use a base-10 numbering system, multiples of 10 are generally the easiest to work with. Since multiples of 10 are so easy, some people in ancient Rome used fractions of 100 quite often before there was a decimal system. They called it in Latin “per centum,” meaning “for each hundred,” and we kept that around, but shortened it to percent.

We don’t only use #/100; we also use #/10 or #/1000, etc., depending on how accurate you need to be.

Of course, there’s no reason you couldn’t use any number. 27/27 is equal to 100/100 (read as 27 divided by 27 and 100 divided by 100); they both equal 1.

But now that we have a decimal system, you can use 0.## to represent a portion of one whole, with the whole being “1,” of course.

I’m not really sure what you mean by your question, as numbers are actually infinite.

The percentage notation is mostly there for readability, as two digits of precision is a good balance between enough detail and too much. It also signals that you are discussing an amount in proportion to some other amount, rather than in absolute terms.

For example, “the amount I have is 0.5 times the amount I need” shortens to “I have 50% of the amount I need.” Both mean exactly the same thing; the percentage is simply the more concise wording.
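The decimal form and the percentage form differ only by a factor of 100. A minimal sketch, using a made-up amount of 200 for “the amount I need”:

```python
need = 200                   # hypothetical amount needed
have = 0.5 * need            # "the amount I have is 0.5 times the amount I need"
percent = have / need * 100  # same proportion, scaled to "out of 100"
print(f"I have {percent:.0f}% of the amount I need")  # I have 50% of the amount I need
```

Note that the 200 cancels out entirely: the percentage depends only on the ratio, which is exactly why it works as a universal frame of reference.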

There are a few ways to look at this.

First is rounding, as in why measure to 1% and not 0.0000001%, or some other figure.
For most applications it just doesn’t matter; the extra accuracy is irrelevant. Even two decimal places (in %) is going to be enough for most engineering activities. You’d need to get down to really specific things like rocket, plane, or massive infrastructure designs before accuracy beyond a few decimal places of a percent actually makes a difference.
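The choice of precision is just a formatting decision on the same underlying number. A small sketch, using 1/7 as an arbitrary measured proportion:

```python
value = 1 / 7  # some measured proportion, picked arbitrarily
# Same value, rounded to different numbers of decimal places of a percent
for digits in (0, 2, 7):
    print(f"{value * 100:.{digits}f}%")
```

The first two lines would satisfy almost any everyday use; the last level of precision only matters in the specialised engineering cases described above.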

The next way to interpret your question is: why is percent out of 100, and not some arbitrarily large figure, since, as your edit says, numbers are infinite?
We work in a base-10 system, so using percent is easy, very easy for quick mental maths. It doesn’t matter much to computers, I guess, but by the time computers came about we were already using percent anyway.

As /u/PM_ME_A_PLANE_TICKET said, it’s because it’s easy.

You can form a mental model around 100, whereas trying to build one around arbitrary numbers is more difficult.

Take flipping a coin, for example. When I ask you, “Which side do you think will come up?” your mind subconsciously makes a probability model. You can express that model in a couple of ways: 1 out of 2. Half. 50%.

Now work in reverse. If I’m trying to tell you the result of a mental model, which is easier: telling you it happens 1/33 of the time, or telling you it happens 3% of the time?
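Both forms describe the same probability; the conversion is just multiplication by 100 and rounding:

```python
p = 1 / 33  # probability of the event, as a fraction
print("as a fraction: 1/33")
print(f"as a percent: about {p * 100:.0f}%")
```

The fraction is exact and the percentage is rounded, but for building an intuitive frame of reference the rounded form is usually all you need.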

Making it a % puts it in a frame of reference that you can understand easily.