Let a and b each be equal to 1. Since a and b are equal,
b^2 = ab (equation 1)
Since a equals itself, it is obvious that
a^2 = a^2 (equation 2)
Subtract equation 1 from equation 2. This yields
a^2 - b^2 = a^2 - ab (equation 3)
We can factor both sides of the equation: a^2 - ab equals a(a - b). Likewise, a^2 - b^2 equals (a + b)(a - b). Substituting into equation 3, we get
(a + b)(a - b) = a(a - b)
So far, so good. Now divide both sides of the equation by (a - b) and we get
a + b = a
Subtract a from both sides and we get
b = 0
But we set b to 1 at the very beginning of this proof, so this means that
1 = 0
With normal algebraic rules we know that if we add one to both sides, then both sides should still be equal, so...
1 + 1 = 0 + 1
2 = 1
Credits to the book "Zero: The Biography of a Dangerous Idea" by Charles Seife.
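To pin down where the argument goes wrong, here is a minimal Python sketch (only the names a and b come from the proof above; the rest is illustrative). Every equation checks out numerically right up to the step that divides by (a - b), which is zero when a and b are both 1:

```python
a = 1
b = 1

# Equations 1-3 all hold: both sides evaluate to the same number.
assert b**2 == a * b                    # equation 1
assert a**2 == a**2                     # equation 2
assert a**2 - b**2 == a**2 - a * b      # equation 3

# The factored forms are also equal (both sides are 0 here).
assert (a + b) * (a - b) == a * (a - b)

# The fatal step: dividing both sides by (a - b) is dividing by zero.
try:
    (a + b) * (a - b) / (a - b)
except ZeroDivisionError:
    print("Cannot divide by (a - b): it is 0 when a = b.")
```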
You can't; it equals 2.
Using a calculator.
You cannot prove that 2 divided by 10 equals 2, because it is not true.
This is a very difficult philosophical question. The best way to look at it is that 2 is defined as 1 plus 1! (If it isn't, how do you define 2?)
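If we accept that framing, the fact can even be checked mechanically. Here is a minimal sketch in Lean 4 (plain Lean, no extra libraries assumed): for natural numbers, 1 + 1 and 2 reduce to the same value, so reflexivity closes the proof.

```lean
-- 1 + 1 = 2 for natural numbers: both sides compute to the same numeral,
-- so `rfl` (proof by reflexivity, i.e. definitional equality) is enough.
example : (1 + 1 : Nat) = 2 := rfl
```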
a^0 = a^(1-1) = a^1/a^1 = a/a = 1
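A quick numeric sanity check of that identity, assuming the line above is the standard a^0 = a^(1-1) = a/a argument (a = 3 is an arbitrary choice; any nonzero a works, since a/a is undefined for a = 0):

```python
a = 3  # any nonzero number works here

# a^0 = a^(1-1) = a^1 / a^1 = a / a = 1
assert a**0 == a**(1 - 1) == a**1 / a**1 == a / a == 1
```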
1 does not equal 2. There are many supposed proofs that work on the assumption that readers will not notice an attempt to divide both sides by zero.
Using faulty logic.
a) Everyone knows what one and two are, so they will know 1 + 1 = 2.
b) Assuming they do not know what one and two are, it will be impossible to explain to them that 1 + 1 = 2, because these definitions are ESSENTIAL to proving 1 + 1 = 2.
No; if 1 plus 1 equals 2, then 2 plus 2 equals 4.
No, because technically, it is not true.
Is the following what you are claiming: that 2^k = 2^(k+1) - 1? That is not true in general; for example, 2^1 does not equal 2^(1+1) - 1, since 2 is not 3.
2 times 1 equals 2.