NEWTON, Ask A Scientist!
Algebra Quandary
Name: Christopher S.
Status: student	
Age:  N/A
Location: N/A
Country: N/A
Date: N/A 

Here is a "proof" that I have found. I am only using the term "proof" in a vague, tongue-in-cheek sort of way.

1)     x = y
2)     x^2 = xy      (multiply both sides by x)
3)     x^2 - y^2 = xy - y^2  (subtract y^2 from each side)
4)     (x + y)(x - y) = y(x - y)     (factor)
5)     x + y = y     (divide out (x - y))
6)     2y = y      (substitute y for x from #1)
7)     2 = 1      (What do you think of that?!)
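The hidden flaw is easiest to see with concrete numbers. A minimal sketch in Python (using x = y = 3, an arbitrary choice for illustration) checks each step and shows that step 5 divides both sides by x - y = 0:

```python
x = y = 3  # step 1: any equal values will do

assert x**2 == x * y                     # step 2: multiply both sides by x
assert x**2 - y**2 == x * y - y**2       # step 3: subtract y^2 from each side
assert (x + y) * (x - y) == y * (x - y)  # step 4: factor; both sides equal 0

# Step 5 divides both sides by (x - y), but x - y == 0, so the step is invalid:
try:
    (x + y) * (x - y) / (x - y)
except ZeroDivisionError:
    print("step 5 divides by zero")
```

Steps 1 through 4 hold because both sides of the factored equation are zero; the contradiction only appears once we divide by that zero quantity.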

Now, here is my philosophical quandary:  I had taken algebra to be a means of
considering numerical relationships regardless of concrete values.  But if
the "proof" (i.e., with the divide by zero) is not valid because we
have to watch our values, then doesn't this imply that algebra
(specifically) and other branches of mathematics are really not thoroughly
abstract?  I.e., must we not always "watch our values" in any branch of
mathematics?  And how does this reflect on the assertion of antinomies and
incompletenesses in the various formal mathematical systems?

As I see it, one does not need to watch the values so long as the algebraic operations themselves are defined for those values. Dividing both sides of an equation by zero is like multiplying both sides by infinity - an undetermined quantity. Any results so derived are invalid, as your example shows.


This is not really a mathematical crisis. It follows quite naturally from the definition of the operation we call multiplication, or, put another way, the definition we give to the symbol "1".

To keep things simple, consider the integers, although that restriction is not really necessary. The operation of multiplication is a shorthand form of addition. By the multiplication of 'X' times 'Y', which we write as X*Y, we mean X+X+X... Y times. So X*1 = X. When we write 'X' times '0', or in the usual notation X*0, we mean X+X+X... zero times; written in the conventional way, X*0 = 0. Or the other way around: 0+0+0... X times is still 0.
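That repeated-addition definition can be sketched directly in Python (times_by_addition is a hypothetical helper name, not from the answer above):

```python
def times_by_addition(x, y):
    """Compute x*y as x added to zero, y times."""
    total = 0
    for _ in range(y):
        total += x
    return total

print(times_by_addition(5, 3))  # 5+5+5 = 15
print(times_by_addition(5, 1))  # X*1 = X
print(times_by_addition(5, 0))  # X*0 = 0: the loop body never runs
```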

Now division is not an operation separate from multiplication; it is the inverse of multiplication, by which we mean: 'X' times 'X-inverse' = 1. Using our long-hand definition that 'X' times 'Y' means X+X+X... Y times, 'X' times 'X-inverse' means X+X+X... 'X-inverse' times = 1. But if we say 'X-inverse' is '0', then we mean X+X+X... zero times = 1, that is, X*0 = 1, and we have infected the operations with a contradiction, because X*0 = 0 and X*0 = 1 cannot coexist in the same logical construction.
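A small sketch of that contradiction: if 'X-inverse' were some count n of repeated additions with X+X+... n times = 1, no such n exists for, say, X = 5, and n = 0 in particular always yields 0, never 1 (repeated_add is an illustrative name, not from the answer):

```python
def repeated_add(x, n):
    """x added together n times."""
    return sum(x for _ in range(n))

x = 5
assert repeated_add(x, 0) == 0  # X*0 = 0, for every X
# No count n (checked up to 100) makes x added n times equal 1,
# so in the integers x has no inverse count, and 0 certainly is not one:
assert all(repeated_add(x, n) != 1 for n in range(101))
print("X*0 = 0 and X*0 = 1 cannot both hold")
```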

This detailed line of argument is summarized by saying, "Division by zero is not defined."

If someone wanted to redefine the arithmetic operations in an alternative way so that "division by zero IS defined," that would be OK, provided no internal contradictions were introduced. But THAT ALGEBRA would not be the one we associate with numbers as we know them.

There are cases where seemingly weird definitions do exist:

The factorial operation, usually written n! = 1*2*3*4*...*n, with 0! = 1 BY DEFINITION.
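The 0! = 1 convention matches the "empty product" that falls out of a loop implementation, and Python's math.factorial agrees (the hand-rolled factorial here is just for illustration):

```python
import math

def factorial(n):
    result = 1              # an empty product is 1, which is why 0! = 1
    for k in range(2, n + 1):
        result *= k
    return result

print(factorial(0), math.factorial(0))  # both 1
print(factorial(5), math.factorial(5))  # both 120
```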

There is also a "function" called the Dirac delta function, call it DDF(x) for short, that occurs in quantum mechanics. It is defined as follows: DDF(x) = 0 for x not equal to zero, while at x = 0 it is unbounded, in just such a way that the integral under the "curve" of DDF(x) from negative infinity to positive infinity equals 1.
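A hedged numerical sketch of the delta's defining property: approximate DDF by rectangles of width eps and unit area. The peak height 1/eps grows without bound as eps shrinks, yet the area under the curve stays 1 (the midpoint-rule integration here is only an illustration):

```python
def rect_delta(x, eps):
    """Unit-area rectangle of width eps centered at 0."""
    return 1.0 / eps if abs(x) < eps / 2 else 0.0

n = 100_000                      # midpoint-rule samples on [-1, 1]
dx = 2.0 / n
for eps in (1.0, 0.1, 0.01):
    area = sum(rect_delta(-1 + (i + 0.5) * dx, eps) for i in range(n)) * dx
    print(f"peak = {rect_delta(0, eps):g}, area = {area:.3f}")
```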

Vince Calder


In such mathematics, the only thing you have to watch is division by zero, because division is not defined when the divisor equals zero. If you do not want to watch such things, don't divide. One way "out of it" is to stop at factoring:
         (x+y)(x-y) = y(x-y)             subtract y(x-y)
         (x+y - y)(x-y) = 0                      simplify
         x(x-y) = 0
         Either x = 0 or x - y = 0
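A quick check of this division-free route for a few values (the loop just verifies the displayed lines; variable names match the derivation):

```python
for v in (1, 2, 10):
    x = y = v                                  # the premise: x = y
    assert (x + y) * (x - y) == y * (x - y)    # the factored equation
    assert (x + y - y) * (x - y) == 0          # after subtracting y(x-y)
    assert x == 0 or x - y == 0                # the only valid conclusion
print("no contradiction: the conclusion is x = 0 or x = y")
```

Nothing absurd follows: instead of 2 = 1, we conclude only that either x is zero or x equals y, which is exactly the premise.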

You can multiply, add, or subtract anything. Division, however, is not defined for ALL real numbers, so it cannot be used as freely.

Dr. Ken Mellendorf

Yes, you must "watch your values." Mathematics is a set of ideas built up from fundamental postulates. One of those postulates, in algebra, is that you cannot divide by zero and get anything meaningful. The math is only as good as the person following the rules.

Steve Ross


NEWTON is an electronic community for Science, Math, and Computer Science K-12 Educators, sponsored and operated by Argonne National Laboratory's Educational Programs, Andrew Skipor, Ph.D., Head of Educational Programs.

For assistance with NEWTON, contact a System Operator or Argonne's Educational Programs:

Educational Programs
Building 360
9700 S. Cass Ave.
Argonne, Illinois
60439-4845, USA
Update: June 2012
Welcome To Newton

Argonne National Laboratory