It's well known that every (real) number satisfies a/a = 1 and 0/a = 0.

So, for a = 0, we have 0/0 = 1 and 0/0 = 0. Since 0/0 must be equal to itself, 1 must be equal to 0.

From here we get that 1 + 0 = 1. But since 1 = 0, it follows that 1 + 1 = 1 (we just replaced the 0 with a 1).

But we also know that 1 + 1 = 2. If 1 + 1 = 2 and 1 + 1 = 1, then obviously 2 = 1, because 1 + 1 = 1 + 1.

Of course, that's not true at all. But if someone says that there is

**an** error in the "proof" above, they will be wrong...

There are actually three gaps (a/a = 1 and 0/a = 0 are not true for all reals, only for all reals except zero; and 0/0 is an indeterminate form, so it doesn't have to be equal to anything, not even itself). I just hope you had a laugh with it.
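If you'd rather see the fix outside of pen and paper, here's a minimal Python sketch (the `safe_divide` helper is hypothetical, just for illustration) showing that both rules only hold when the denominator isn't zero:

```python
def safe_divide(a, b):
    """Return a/b, or None when the quotient is undefined (b == 0)."""
    if b == 0:
        return None  # a/a = 1 and 0/a = 0 only hold for a != 0
    return a / b

print(safe_divide(5, 5))  # 1.0  -- a/a = 1 for a != 0
print(safe_divide(0, 5))  # 0.0  -- 0/a = 0 for a != 0
print(safe_divide(0, 0))  # None -- 0/0 is undefined, so no contradiction
```

Since 0/0 never produces a value at all, the step "0/0 = 1 and 0/0 = 0, therefore 1 = 0" simply never gets off the ground.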

See you some other day, with my proof that PI = 1 (one of my favorites).

------

There are 10 kinds of people in the world: those who know binary, those who don't, and those who also know ternary.