Spoiler Alert: This post contains information that gives away the solution to one of the current puzzles.
Most of us were told by our math teachers that we are not allowed to divide by 0, but the only reason, if any, they gave us for this rule was the intuitive argument that there is no way to make sense of dividing a quantity into 0 equal parts.
We have no trouble dividing 12 apples into 6 portions (2 each) or 3 portions (4 each) or even 24 portions (½ apple each). We can even make sense of dividing by ½ using the intuitive understanding of division. In each case already considered, the reasoning is that if 12 apples comprise 6 (or 3 or 24) portions, then one portion must be 2 (or 4 or ½) apples. Likewise, if 12 apples comprise ½ portion, then one portion must be 24 apples. But if we are told that 12 apples comprise 0 portions, how can we determine how many apples comprise one portion?
While this is not bad as an intuitive explanation, there are plenty of phenomena in the Universe and plenty of ideas in math that are either counter-intuitive or that can’t be explored intuitively at all. In any case, mathematicians, as well as all others who place their trust in logical or empirical methodology, regard intuition as only a muse: it may point us in the right direction, but it does not eliminate the need to actually make the journey, step by step, leaving footprints that can be followed and requiring no leaps of faith (or intuition).
As it happens, because the mathematical definition of division is a formalization of our intuitive understanding, the logical explanation of why division by zero is prohibited is very similar to the intuitive explanation, except that using a precise definition eliminates the need for intuition.
Subtraction is defined in terms of addition: “x – y = z” means (is defined as) “x = y + z”, so if we are asked to determine the value of x – y, we do so by determining what value of z makes “x = y + z” true. Specifically, “6 – 2 = x” means “6 = 2 + x”, and since 6 = 2 + 4, x must be 4, so 6 – 2 = 4 by definition. Similarly, division is defined in terms of multiplication: “x ÷ y = z” means “x = yz”, so if we are asked to determine the value of x ÷ y, we do so by determining what value of z makes “x = yz” true. Specifically, “6 ÷ 2 = x” means “6 = 2x”, and since 6 = 2•3, x must be 3, so 6 ÷ 2 = 3 by definition.
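These definitions can be sketched in code: instead of computing x – y or x ÷ y directly, we search for the value of z that makes the defining equation true. The function names and the candidate range below are my own choices for illustration, not anything from the post.

```python
def differences(x, y, candidates):
    """Return every z in candidates satisfying the definition x == y + z."""
    return [z for z in candidates if x == y + z]

def quotients(x, y, candidates):
    """Return every z in candidates satisfying the definition x == y * z."""
    return [z for z in candidates if x == y * z]

# "6 - 2 = x" means "6 = 2 + x"; exactly one candidate works, namely 4.
print(differences(6, 2, range(10)))  # [4]

# "6 / 2 = x" means "6 = 2x"; exactly one candidate works, namely 3.
print(quotients(6, 2, range(10)))  # [3]
```

In each ordinary case the search returns exactly one value, which is why the definition pins down a single answer.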
However, if we are asked to determine the value of 6 ÷ 0, we find that there is no value of x that makes “6 = 0x” true, because 0 times any number is 0, so we say that 6 ÷ 0 has no value, or that “6 ÷ 0” is meaningless or undefined.
Most high school textbooks say nothing specifically about “0 ÷ 0”, so the reader is likely to assume that dividing 0 by 0 is no different from dividing a non-zero number by 0. Let’s see if that assumption holds water.
If we are asked to determine the value of 0 ÷ 0, to be consistent, we would have to determine what value of x makes “0 = 0x” true. Well, 0 = 0•1, so x = 1. And 0 = 0•2, too, so x = 2. And if x = 1 and x = 2, then 1 = 2. By now, you should be way ahead of me and realize that every value of x makes “0 = 0x” true, so x equals every number, and all numbers are equal. Since every value fits the definition of 0 ÷ 0 equally well, we cannot say that 0 ÷ 0 is undefined; rather, the definition does not allow us to determine a distinct value for it, so 0 ÷ 0 is indeterminate.
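The contrast between “undefined” and “indeterminate” shows up clearly if we run the same kind of search for both cases. The helper below is a hypothetical illustration of the definition, checking only the candidates 0 through 9:

```python
def quotients(x, y, candidates):
    """Return every z in candidates satisfying the definition x == y * z."""
    return [z for z in candidates if x == y * z]

# 6 / 0: no z makes 6 == 0 * z true, so the search comes back empty (undefined).
print(quotients(6, 0, range(10)))  # []

# 0 / 0: every z makes 0 == 0 * z true, so the search cannot single one out
# (indeterminate).
print(quotients(0, 0, range(10)))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

An empty result and a result containing everything fail in different ways: the first equation has no solution at all, while the second has too many.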
Before we get to calculus, there is no context that requires us to distinguish between these two situations, so our teachers and textbooks avoid the issue by saying simply, “Thou shalt not divide by zero.” But isn’t it better to understand why we have the rules that we have, whether those rules are mathematical or legal or moral, etc., than to have the rules simply imposed on us?