Why Dividing by Zero Breaks Math and Why Teachers Warn You About It

From a young age, we all learn in school that you can’t divide by zero. It is often treated as a mortal sin in math class: teachers usually say it’s undefined and move on. But why is it forbidden? Why is dividing by zero so dangerous that math itself breaks down if we try it? The goal of this post is to clear up the confusion about dividing by zero.

So let’s start from the very basics. We all know that division is the opposite of multiplication. From algebra class we can say that $6$ divided by $3$ is $2$ because $3\times 2=6$. It becomes more difficult if we ask what $5$ divided by $0$ is: we’re really asking what number times $0$ equals $5$. But since $0 \times n = 0$ for every number $n$, no number can answer this question.
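Framing division as “which number times the divisor gives the dividend?” can even be checked by brute force. Here is a minimal Python sketch of that idea (the helper name `solve_division` and the candidate range are mine, purely for illustration):

```python
def solve_division(dividend, divisor, candidates=range(-1000, 1001)):
    """Brute-force search: every candidate n with divisor * n == dividend."""
    return [n for n in candidates if divisor * n == dividend]

print(solve_division(6, 3))       # [2]: because 3 * 2 == 6
print(solve_division(5, 0))       # []: no n satisfies 0 * n == 5
print(len(solve_division(0, 0)))  # 2001: every candidate satisfies 0 * n == 0
```

Dividing by a nonzero number has exactly one answer; dividing $5$ by $0$ has none at all.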

Let’s look at it another way. Suppose you have baked a cake and want to know how large a piece each person gets. You have a cake of size $1$ and $N$ guests at your party, which means each person gets $\frac{1}{N}$ of the whole cake. You can also turn it around: if you want to divide the cake among more people, one option is simply to give smaller pieces.

Say each piece has size $\epsilon$: the smaller the piece, the more people can get one. The number of people that can be fed with pieces of size $\epsilon$ is $1 \div \epsilon$. So let’s now make $\epsilon$ smaller and smaller.

$$ 1 \div \epsilon = 1 \div 0.1 = 10$$ $$ 1 \div 0.01 = 100$$ $$ 1 \div 0.001 = 1000$$ $$ 1 \div 0.0001 = 10000$$

You see that for a smaller $\epsilon$ the number of people receiving a piece grows. Essentially we are taking a limit $\epsilon \rightarrow 0$: we let $\epsilon$ approach $0$, and the number of cake receivers grows larger and larger as $\epsilon$ shrinks. We can say that the limit for $\epsilon \rightarrow 0$ goes to $+\infty$.
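The shrinking pieces can be tabulated in a few lines of Python; here is a small sketch that uses exact fractions so that floating-point noise does not obscure the pattern:

```python
from fractions import Fraction

eps = Fraction(1, 10)  # piece size epsilon = 0.1
for _ in range(4):
    people = 1 / eps   # number of guests who get a piece of size eps
    print(f"1 / {float(eps)} = {people}")
    eps /= 10          # shrink the piece: the guest count grows tenfold
```

Each time $\epsilon$ shrinks by a factor of ten, the number of fed guests grows by a factor of ten, exactly as in the table above.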

Since dividing by zero itself makes no sense in our arithmetic, a limit is a kind of controlled environment where we say: “Ok, what we do here might be fundamentally wrong, but let’s see what happens if we still try to do something cursed like dividing by zero.” In more formal terms we write: $$ \lim_{\epsilon \to 0} \frac{1}{\epsilon} $$ The first time you see this it might seem intimidating, but don’t worry: there is a solid foundation behind it. Of course, if we make $\epsilon$ small enough our reasoning stops making physical sense at some point, since the pieces cannot be made smaller than mere atoms, but it is the idea that is important here.

But what about $0 \div 0$? This one is trickier. It asks what number times $0$ equals $0$, and as we discussed before, every number multiplied by zero gives zero! $$ 1 \times 0 = 0$$ $$ 1000 \times 0 = 0$$ $$ 10\,000\,000\,000\,000\,000\,000\,000\,000\,000\,000\,000\,000\,000\,000\,000\,000 \times 0 = 0$$ I think I made my point. So $0 \div 0$ doesn’t have one answer; it has infinitely many. That’s why it’s called indeterminate.
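To see why the form $0 \div 0$ alone pins down nothing, here is a small Python sketch (the function name `ratios` is made up for illustration): three expressions that all turn into “$0/0$” at $x = 0$, yet approach three different values as $x$ shrinks.

```python
# Three expressions that each become the form "0/0" at x = 0,
# yet approach three different limits as x shrinks toward 0.
def ratios(x):
    return x / x, (2 * x) / x, (x * x) / x

for x in (0.1, 0.001, 0.00001):
    print(x, ratios(x))

# x/x -> 1, (2x)/x -> 2, and (x*x)/x -> 0: the same "0/0" shape
# can hide any answer, which is why 0/0 is called indeterminate.
```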

Okay, but why do we need these fancy limits? Why can’t we just say that $1 \div 0 = \infty$? Well, let’s suppose we do: $$1 = \infty \times 0$$ Then math becomes inconsistent. Let’s explain this with the cake example. You invite $N$ people to your cake party, and since you are very generous you give each person a whole cake. So you have $N$ cakes and $N$ people: easy peasy, everyone gets one cake!

Okay, but what if each guest then also invites $N$ people to your party? Then you end up with $N \times N$ guests, and each cake has to be shared among the guests present.

Suppose you want to throw a very extravagant party and invite a huge number of guests. Let’s see what happens to the amount of cake each guest gets when we take the limit $N\rightarrow \infty$.
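Before working the limits out formally, the two scenarios can be checked numerically with a few lines of Python (the variable names are mine):

```python
# Cake per guest in the two party scenarios as N grows.
# Scenario 1: N cakes shared by N guests   -> N * (1/N)
# Scenario 2: N cakes shared by N*N guests -> N * (1/N**2)
for N in (10, 1_000, 100_000):
    print(N, N * (1 / N), N * (1 / N**2))
# The first product stays at 1, while the second shrinks toward 0.
```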

In the first situation you get the following limit. This is a $0\times\infty$ situation, because you have $N$ cakes and $N$ guests, but the two effects cancel out: the number of guests grows exactly as fast as the supply of cakes. $$ \lim_{N \to \infty} N \frac{1}{N} =1 $$ The second situation is different. The invited guests bring more guests than you provisioned for! Inviting more people now means each guest gets less cake, because the crowd outgrows the supply. In mathematical terms: $$ \lim_{N \to \infty} N \frac{1}{N^2} = \lim_{N \to \infty} \frac{1}{N} =0 $$ Note that this is also a $0\times\infty$ situation, but this time the result is different. This is why it is dangerous to simply declare $1 = \infty \times 0$. We really need our limits to create a safe space in which we cannot produce contradictions; otherwise we couldn’t trust any result anymore. If we allowed division by zero, math would collapse. Division by zero isn’t just a rule teachers made up to haunt you. It’s a safeguard. Without it, arithmetic would crumble under contradictions.

A nice illustration of this was given in the meme on my Instagram account. The following reasoning was given to "prove" that $1=2$: $$ a = b $$ $$ a^2 = ab $$ $$ a^2 - b^2 = ab - b^2 $$ $$ (a+b)(a-b) = b(a-b) $$ $$ a + b = b $$ $$ 2b = b $$ $$ 2 = 1 $$

Do you see where this reasoning goes wrong? Don't worry if you don't spot it right away; it is quite subtle the first time you see this kind of "proof". The first time I saw it, I was utterly confused. Is math all wrong? Was it all a dream? Don't worry. The mistake lies in going from the fourth line to the fifth. At the start we assumed that $a=b$, so when moving from the $4$th to the $5$th line we actually divide by $a-b=0$. That is the crucial mistake, and a great illustration of why you should never casually divide by zero!

The struggle with dividing by zero isn’t new. As early as the $7$th century, the Indian mathematician Brahmagupta tried to formalize rules for zero and even wrote down expressions involving division by zero. However, as we now know, this can lead to trouble, and not all of his rules worked consistently. Centuries later, the English mathematician John Wallis, who introduced the infinity symbol $\infty$, suggested that dividing by smaller and smaller numbers was like reaching infinity. That is exactly what we did today! However, our story is not over yet. Only with the invention of calculus in the $17$th century did mathematicians understand how to rigorously handle these situations using limits, but that is a story for a future post.
