Why is it mathematically consistent to allow imaginary numbers but prohibit division by zero?


Couldn’t the result of division by zero be “defined”, just like the square root of -1?

Edit: Wow, thanks for all the great answers! This thread was really interesting and I learned a lot from you all. While there were many excellent answers, the ones that mentioned Riemann Sphere were exactly what I was looking for:

https://en.wikipedia.org/wiki/Riemann_sphere

TIL: There are many excellent mathematicians on Reddit!
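
For anyone else who lands here: the Riemann sphere idea, very roughly, is to add a single point at infinity to the complex numbers and define z/0 = ∞ for z ≠ 0. Below is a minimal Python sketch of those rules (my own toy code, not taken from the article); note that 0/0 and ∞/∞ are still left undefined, so the gap moves rather than disappears.

```python
INF = "inf"  # a plain marker for the single point at infinity in this sketch

def divide(a, b):
    """Division on the extended complex plane; a and b are complex or INF."""
    if a == INF and b == INF:
        raise ValueError("inf / inf is left undefined")
    if a == INF:
        return INF          # inf / z = inf for any finite z
    if b == INF:
        return 0j           # z / inf = 0 for any finite z
    if b == 0:
        if a == 0:
            raise ValueError("0 / 0 is left undefined")
        return INF          # z / 0 = inf for z != 0 -- the new rule
    return a / b            # ordinary complex division otherwise

print(divide(1 + 2j, 0))    # inf
print(divide(0, 1 + 2j))    # 0j
```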


19 Answers

Anonymous

In short: Division by zero is not useful.

When we work with complex numbers, they behave in a predictable and consistent manner. They might be as “imaginary” as division by zero is, but the difference is that if you accept imaginary numbers, then you have a functional framework you can use to model all kinds of things. When we use complex numbers to do, say, signal processing, the math works out intuitively and the end result corresponds to what we see in reality.
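
As a toy illustration (my own sketch, not from the answer itself): treating a sinusoid A·cos(ωt + φ) as the complex phasor A·e^(iφ) turns adding two same-frequency signals into a single complex addition, and the amplitude and phase you read off match what you would measure.

```python
import cmath, math

def phasor(amplitude, phase):
    """Represent A*cos(wt + phase) by the complex number A*e^(i*phase)."""
    return amplitude * cmath.exp(1j * phase)

a = phasor(1.0, 0.0)              # cos(wt)
b = phasor(1.0, math.pi / 2)      # cos(wt + 90 degrees)
total = a + b                     # sum of the two signals at the same frequency

print(abs(total))                        # combined amplitude: ~1.414 (sqrt 2)
print(math.degrees(cmath.phase(total)))  # combined phase: ~45 degrees
```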

If you define division by zero as something, that framework falls apart. Division and multiplication are inverse operations, so defining x/0 = k means that k*0 has to give you back x, and now you have to explain how multiplying by zero could produce something other than zero.
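
Spelled out, the usual one-line contradiction (with k standing in for whatever value 1/0 would be defined as) looks like this:

```latex
\frac{1}{0} = k
\;\Longrightarrow\; k \cdot 0 = 1,
\quad\text{but}\quad k \cdot 0 = 0 \text{ for every } k,
\quad\text{so}\quad 1 = 0.
```

The Riemann sphere mentioned in the edit dodges this by giving up the rule that k·0 recovers 1, at the cost of leaving 0·∞ undefined.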

I’m sure other people will offer a dozen and a half explanations of the weird things division by zero leads to, but the only reason we don’t just accept that weirdness, the way we accept the idea of “negative area” with complex numbers, is that going down that rabbit hole doesn’t yield anything really useful to us.
