What’s stopping mathematicians from defining a number for 1 ÷ 0, like what they did with √-1?

Anonymous

Mathematicians have done exactly this; it’s just rare to encounter it outside of specialized fields.

In math, we set up a system of rules (formally, “axioms”), then work out the consequences. Importantly, there is no single “best” or “correct” set of rules to start from. But we prefer the simplest set of rules that lets us handle whatever scenario we are thinking about, and we prefer rules that lead to fewer inconsistencies.

Thus, when you are just getting started learning about roots and exponents, we say “negative numbers don’t have square roots” and “you cannot divide a number by zero.” But then we get to quadratic equations and realize that being able to assign some value to the square root of a negative number would be convenient. So we come up with some new rules that let us do that, while still keeping inconsistencies to a minimum.
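
As a concrete sketch (my example, not part of the original point), here is how defining √-1 pays off, using Python’s standard cmath module: the quadratic x² + 1 = 0 has no real roots, but the quadratic formula works fine once complex numbers are allowed.

```python
import cmath

# x^2 + 1 = 0 has no real roots, but once i = sqrt(-1) is defined,
# the quadratic formula produces two complex roots.
a, b, c = 1, 0, 1                       # coefficients of x^2 + 0x + 1
disc = cmath.sqrt(b * b - 4 * a * c)    # sqrt(-4) = 2j
print((-b + disc) / (2 * a))            # 1j
print((-b - disc) / (2 * a))            # -1j
```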

We can do the same for 1/0, but the scenarios where we want to do so are rarer, so you may not have run into it. And there are in fact several different choices of rules that get made, depending on exactly what we want to do.
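
To see why any such choice has to bend some familiar rule: suppose 1/0 were an ordinary number q obeying the usual algebra. One line of arithmetic already breaks:

```latex
\frac{1}{0} = q \;\Longrightarrow\; 1 = q \cdot 0 = 0
```

Since 1 ≠ 0, any system that assigns a value to 1/0 must give something up (for instance, the rule that q · 0 = 0 for every q), and different settings choose to give up different things.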

In floating-point math (what computers mostly use, following the IEEE 754 standard), 1/0 is defined to be positive infinity and -1/0 to be negative infinity, while the indeterminate form 0/0 gets the special “Not a Number” (NaN) value. In other contexts, such as the projectively extended real line, we say 1/0 “equals” a single unsigned infinity. And in still other settings we invent hyperreal numbers, which (sort of) get at 1/0 by letting us divide by infinitesimals instead (https://en.wikipedia.org/wiki/Hyperreal_number).
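
Here is a small illustration of the floating-point choice (a sketch using NumPy, since plain Python floats raise ZeroDivisionError instead of following IEEE 754):

```python
import numpy as np

# IEEE 754 semantics: nonzero / zero gives a signed infinity,
# while the indeterminate 0/0 gives NaN ("Not a Number").
with np.errstate(divide="ignore", invalid="ignore"):  # silence the warnings
    print(np.float64(1.0) / np.float64(0.0))    # inf
    print(np.float64(-1.0) / np.float64(0.0))   # -inf
    print(np.float64(0.0) / np.float64(0.0))    # nan
```

Part of the motivation for this choice was to let long numerical computations keep running instead of halting the moment a division by zero appears.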
