Couldn’t the result of division by zero be “defined”, just like the square root of -1?
Edit: Wow, thanks for all the great answers! This thread was really interesting and I learned a lot from you all. While there were many excellent answers, the ones that mentioned the Riemann sphere were exactly what I was looking for:
https://en.wikipedia.org/wiki/Riemann_sphere
TIL: There are many excellent mathematicians on Reddit!
You can make it consistent, if you define things carefully. Sit down, declare `x / 0 = q` or whatever, and see what happens. What is `q` multiplied by a number? What about adding it? And so on.
What you’re really asking is, why don’t I hear about this `q`, and why don’t mathematicians seem to use it?
Imaginary numbers were invented to solve a problem: mathematicians were trying to solve cubic polynomials and were constantly running into square roots of negative numbers. They decided, well, what if those did work, and the imaginary numbers ended up canceling out so you got "normal" solutions. That's why we know about imaginary numbers: they ended up being helpful for solving problems. (They also ended up having lots of applications in engineering and computer science, so lucky them, I guess.)
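A quick sketch of the classic example (Bombelli's, from the 1500s): the cubic `x^3 = 15x + 4` has the perfectly ordinary real root `x = 4`, but Cardano's formula expresses that root as a sum of cube roots of complex numbers. The imaginary parts cancel exactly, which is what convinced people these "impossible" numbers were useful:

```python
# Cardano's formula applied to x^3 = 15x + 4 gives
#     x = cbrt(2 + 11i) + cbrt(2 - 11i).
# Since (2 + i)^3 = 2 + 11i, the principal cube roots are
# 2 + i and 2 - i, and the imaginary parts cancel when summed.
root = (2 + 11j) ** (1 / 3) + (2 - 11j) ** (1 / 3)
print(root.real)  # ~ 4.0, the real root of the cubic
```

Python's `**` on complex numbers takes the principal branch of the cube root, which happens to be the right pairing here; in general you'd have to match up the branches by hand.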
If you sit down and try to solve problems with your `q`, you don't really get anything useful. Assuming `x / 0 = q` causes too many contradictions to be helpful. For example, since `x / 0 = q` for every `x`, multiplying both sides by zero gives `q * 0 = x` for every `x`, so `q * 0` is equal to every number at once, which forces every number to equal every other number. You can probably design some way out of this, but you can see how this is hard to use as a tool to solve problems.
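To spell the collapse out, here is a quick sketch (assuming we keep the usual rule that division undoes multiplication, i.e. `(x / y) * y = x`):

```latex
% Suppose x / 0 = q for every number x, and keep (x/y) \cdot y = x.
% Then for any two numbers a and b:
\frac{a}{0} = q \implies q \cdot 0 = a,
\qquad
\frac{b}{0} = q \implies q \cdot 0 = b,
% hence a = q \cdot 0 = b for all a, b: every number is equal
% to every other number, and arithmetic collapses to one value.
```

This is exactly why systems that do define division by zero (like the Riemann sphere, where `x / 0 = ∞` for nonzero `x`) give up some of the ordinary algebraic rules in exchange.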
The fun thing about math is that you can assume anything to be anything; the hard part is assuming something useful. Imaginary numbers are. Dividing by zero isn't, really.