Math, asked by cherry4444, 10 months ago

Prove that 1 - 2 = 0 and 1 + 2 = 4

Answers

Answered by MRsteveAustiN

Answer:

1 - 2 = 0 ....(1)

1 + 2 = 4 ....(2)

In ordinary arithmetic, 1 - 2 = -1 and 1 + 2 = 3, so (1) claims that -1 = 0 and (2) claims that 3 = 4.

Multiplying the claim -1 = 0 through by -4 gives 4 = 0.

So the two statements contradict ordinary arithmetic and cannot actually be proved; see the quick check below.
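As a quick sanity check (a two-line Python snippet, purely illustrative), neither claimed equality holds in ordinary arithmetic:

```python
# Neither claimed equation holds in ordinary integer arithmetic.
print(1 - 2, 1 - 2 == 0)   # -1 False
print(1 + 2, 1 + 2 == 4)   #  3 False
```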

Answered by wasifthegreat786

Answer:

Step-by-step explanation:

How to "Prove" That 2 = 1

Let's begin our journey into the bizarre world of apparently correct, yet obviously absurd, mathematical proofs by convincing ourselves that 1 + 1 = 1. And therefore that 2 = 1. I know this sounds crazy, but if you follow the logic (and don't already know the trick), I think you'll find that the "proof" is pretty convincing.

Here's how it works:

Assume that we have two variables a and b, and that: a = b

Multiply both sides by a to get: a² = ab

Subtract b² from both sides to get: a² - b² = ab - b²

This is the tricky part: Factor the left side (it's a difference of squares) to get (a + b)(a - b), and factor b out of the right side to get b(a - b). If you're not sure how the factoring works, don't worry: you can check it by multiplying everything back out (FOIL) and seeing that it matches, as in the quick check below. The end result is that our equation has become: (a + b)(a - b) = b(a - b)
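Here is a small, optional check of those two factoring claims, written in Python with the sympy library (assuming it is installed). It only verifies the algebra in this step, not the later steps of the "proof":

```python
# Optional check of the two factoring identities used above,
# assuming the third-party sympy library is available.
from sympy import symbols, expand

a, b = symbols("a b")

# Difference of squares: (a + b)(a - b) really is a² - b²
print(expand((a + b) * (a - b)) == a**2 - b**2)   # True

# Common factor: b(a - b) really is ab - b²
print(expand(b * (a - b)) == a*b - b**2)          # True
```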

Since (a - b) appears on both sides, we can cancel it to get: a + b = b

Since a = b (that's the assumption we started with), we can substitute b in for a to get: b + b = b

Combining the two terms on the left gives us: 2b = b

Since b appears on both sides, we can divide through by b to get: 2 = 1

Wait, what?! Everything we did there looked totally reasonable. How in the world did we end up proving that 2 = 1?

What Are Mathematical Fallacies?

The truth is we didn't actually prove that 2 = 1. Which, good news, means you can relax—we haven't shattered all that you know and love about math. Somewhere buried in that "proof" is a mistake. Actually, "mistake" isn't the right word because it wasn't an error in how we did the arithmetic manipulations, it was a much more subtle kind of whoopsie-daisy known as a "mathematical fallacy."

It's never OK to divide by zero!

What was the fallacy in the famous faux proof we looked at? Like many other mathematical fallacies, our proof relies upon the subtle trick of dividing by zero. And I say subtle because this proof is structured in such a way that you might never even notice that division by zero is happening. Where does it occur? Take a minute and see if you can figure it out…

OK, got it?

It happened when we divided both sides by a - b in the fifth step. But, you say, that's not dividing by zero—it's dividing by a - b. That's true, but we started with the assumption that a is equal to b, which means that a - b is the same thing as zero! And while it's perfectly fine to divide both sides of an equation by the same expression, it's not fine to do that if the expression is zero. Because, as we've been taught forever, it's never OK to divide by zero!
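To make the hidden division by zero concrete, here is the same chain of steps replayed in Python with an arbitrary choice of numbers, say a = b = 3 (any value works, since a = b forces a - b to be zero):

```python
# Replaying the "proof" with concrete numbers, a = b = 3 (arbitrary choice).
a = b = 3

lhs = (a + b) * (a - b)   # left side:  (a + b)(a - b)
rhs = b * (a - b)         # right side: b(a - b)
print(lhs, rhs)           # 0 0   -- both sides are just zero

print(a - b)              # 0     -- so "cancelling (a - b)" divides by zero
print(a + b == b)         # False -- the cancelled equation a + b = b is wrong
```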

Why Can't You Divide By Zero?

Which might get you wondering: Why exactly is it that we can't divide by zero? We've all been warned about such things since we were little lads and ladies, but have you ever stopped to think about why division by zero is such an offensive thing to do? There are many ways to think about this. We'll talk about two reasons today.

The first has to do with how division is related to multiplication. Let's imagine for a second that division by zero is fine and dandy. In that case, a problem like 10 / 0 would have some value, which we'll call x. We don't know what it is, but we'll just assume that x is some number. So 10 / 0 = x. We can also look at this division problem as a multiplication problem: what number x do we have to multiply by 0 to get 10? Of course, there's no answer to this question, since every number multiplied by zero is zero. Which means the operation of dividing by zero is what's dubbed "undefined."
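The same point in code (a Python sketch, just as an illustration): no candidate value of x satisfies x * 0 = 10, and Python itself treats the division as an error.

```python
# If 10 / 0 had some value x, then x * 0 would have to equal 10.
# But multiplying any number by 0 gives 0 (or -0.0), never 10.
for x in [1, 7, -3.5, 1_000_000]:
    print(x, "* 0 =", x * 0)

# So the division itself is undefined, and Python reports it as an error.
try:
    print(10 / 0)
except ZeroDivisionError as err:
    print("10 / 0 is undefined:", err)
```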

The second way to think about the screwiness of dividing by zero—and the reason we can't do it—is to imagine dividing a number like 1 by smaller and smaller numbers that get closer and closer to zero. For example:

1 / 1 = 1

1 / 0.1 = 10

1 / 0.01 = 100

1 / 0.001 = 1,000

1 / 0.0001 = 10,000

...

1 / 0.00000000001 = 100,000,000,000

and so on forever. In other words, as we divide 1 by increasingly small numbers—which are closer and closer to zero—we get a larger and larger result. In the limit where the denominator of this fraction actually becomes zero, the result would be infinitely large.
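A short Python loop (again, just an illustration) reproduces the pattern above and shows the quotient growing without bound as the denominator shrinks toward zero:

```python
# Divide 1 by smaller and smaller positive numbers; the quotient blows up.
for k in range(12):
    denominator = 10.0 ** -k          # 1, 0.1, 0.01, ..., 0.00000000001
    print(f"1 / {denominator:g} = {1 / denominator:,.0f}")
```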

Which is yet another very good reason that we can't divide by zero. And why 1 + 1 is indeed equal to 2…no matter what our screwy "proof" might say.
