Why is it mathematically consistent to allow imaginary numbers but prohibit division by zero?


Couldn’t the result of division by zero be “defined”, just like the square root of -1?

Edit: Wow, thanks for all the great answers! This thread was really interesting and I learned a lot from you all. While there were many excellent answers, the ones that mentioned the Riemann sphere were exactly what I was looking for:

https://en.wikipedia.org/wiki/Riemann_sphere
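(In short: the Riemann sphere extends the complex plane with a single point ∞, and one defines z/0 = ∞ for every z ≠ 0. Note that 0/0 is still left undefined even there.)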

TIL: There are many excellent mathematicians on Reddit!


19 Answers

Anonymous 0 Comments

There is a widely-used mathematical domain where division by zero is allowed, including 0/0.

It’s called [IEEE 754](https://en.wikipedia.org/wiki/IEEE_754).

In this domain you have a large but finite set of rationals (the representable floating-point numbers), plus ±∞, the signed zeros ±0, and NaN ("not a number").

Any non-zero number divided by zero gives +∞ or −∞, with the sign determined by the signs of the operands, and zero divided by zero gives NaN. Arithmetic involving NaN also gives NaN, and NaN does not compare equal to itself.
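A minimal C sketch of these rules, assuming the compiler follows IEEE 754 semantics for doubles (as mainstream compilers on common hardware do):

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    double pos = 1.0, neg = -1.0, zero = 0.0;

    /* Non-zero divided by zero yields a signed infinity. */
    printf("%f\n", pos / zero);   /* inf  */
    printf("%f\n", neg / zero);   /* -inf */

    /* Zero divided by zero yields NaN. */
    double nan_result = zero / zero;
    printf("%f\n", nan_result);   /* nan */

    /* NaN propagates through arithmetic and never equals itself. */
    printf("%d\n", nan_result == nan_result);        /* 0 (false) */
    printf("%d\n", isnan(nan_result + 1.0) != 0);    /* 1 (true)  */

    return 0;
}
```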

This can result in some very non-intuitive behaviour, but it is the system underpinning vast amounts of computing.
