Math, asked by smtsingh588, 4 days ago

1/a + 1/b = 1/c, where the greatest common factor gcd(a, b, c) = 1

Answers

Answered by dharmistharana654

The answer is 1. Hope this helps you.

Answered by venom0136

Step-by-step explanation:

Let gcd(a, b) = g, and write a = a′g and b = b′g (so that gcd(a′, b′) = 1). The equation 1/a + 1/b = 1/c is the same as c(a + b) = ab, which, after dividing through by g, becomes

c(a′ + b′) = a′b′g.
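
As a quick sanity check, here is a minimal Python sketch; the values a = 6, b = 3, c = 2 are just one illustrative primitive solution, not part of the original question.

from math import gcd

# Illustrative primitive solution (assumed for this check): 1/6 + 1/3 = 1/2
a, b, c = 6, 3, 2
assert gcd(gcd(a, b), c) == 1          # gcd(a, b, c) = 1

g = gcd(a, b)                          # g = 3
a1, b1 = a // g, b // g                # a' = 2, b' = 1, with gcd(a', b') = 1
assert c * (a + b) == a * b            # original form: c(a + b) = ab
assert c * (a1 + b1) == a1 * b1 * g    # reduced form: c(a' + b') = a'b'g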

Now, (a′ + b′) divides a′b′g but is relatively prime to both a′ and b′ (since gcd(a′ + b′, a′) = gcd(b′, a′) = 1, and likewise for b′), so it must divide g. Similarly, g divides c(a′ + b′) but is relatively prime to c (note that gcd(g, c) = gcd(gcd(a, b), c) = gcd(a, b, c) = 1), so it must divide (a′ + b′). Thus, as g and (a′ + b′) divide each other, we have

(a′+b′)=g,

and therefore

(a + b) = g(a′ + b′) = g².
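
This can also be spot-checked by brute force; the sketch below is only illustrative, and the search bound N = 200 is arbitrary.

from math import gcd

# Enumerate solutions of 1/a + 1/b = 1/c with gcd(a, b, c) = 1 and a, b <= N,
# and confirm a + b = gcd(a, b)**2 for each of them.
N = 200
for a in range(1, N + 1):
    for b in range(a, N + 1):          # b >= a just avoids counting pairs twice
        if (a * b) % (a + b) != 0:
            continue                   # c = ab/(a + b) would not be an integer
        c = a * b // (a + b)
        if gcd(gcd(a, b), c) != 1:
            continue                   # keep only primitive solutions
        g = gcd(a, b)
        assert a + b == g * g, (a, b, c)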

The hypothesis may be written as (a + b)c = ab, which is equivalent to (a − c)(b − c) = c².
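
The equivalence is a short algebraic identity; a symbolic check (using sympy purely as an illustration) makes it explicit:

from sympy import symbols, expand

a, b, c = symbols('a b c')
# (a - c)(b - c) - c^2 expands to ab - c(a + b), so the equations
# (a + b)c = ab and (a - c)(b - c) = c^2 say exactly the same thing.
print(expand((a - c) * (b - c) - c**2))    # -> a*b - a*c - b*c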

Let p be a prime factor of c. Then p divides a − c or b − c, but it cannot divide both, for then p would divide a, b and c, contradicting gcd(a, b, c) = 1. It follows that a − c and b − c are relatively prime (any common prime divisor would divide their product c², hence c, which we have just ruled out), and since their product is the square c², each of them must itself be a square: a − c = u², b − c = v². This implies that c = uv (taking u, v > 0). But then a + b − 2c = u² + v², and so a + b = (u + v)².
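
The same brute-force loop as above (with the same arbitrary bound) can be reused to confirm this: for every primitive solution, a − c and b − c come out as perfect squares whose square roots multiply to c.

from math import gcd, isqrt

N = 200
for a in range(1, N + 1):
    for b in range(a, N + 1):
        if (a * b) % (a + b) != 0:
            continue
        c = a * b // (a + b)
        if gcd(gcd(a, b), c) != 1:
            continue
        u, v = isqrt(a - c), isqrt(b - c)
        assert u * u == a - c and v * v == b - c    # both are perfect squares
        assert c == u * v and a + b == (u + v)**2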

The argument above means that all examples of

1/a + 1/b = 1/c

with gcd(a,b,c)=1 are given by

1/(u(u + v)) + 1/(v(u + v)) = 1/(uv)

with gcd(u, v) = 1, i.e. a = u(u + v), b = v(u + v) and c = uv.
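
As a final check (again with an arbitrary bound, and ordering each pair so that a ≤ b), this family can be compared against the brute-force list; the two sets of primitive solutions coincide.

from math import gcd

N = 200   # arbitrary bound on a and b

# Primitive solutions found by brute force, stored as (a, b, c) with a <= b.
brute = set()
for a in range(1, N + 1):
    for b in range(a, N + 1):
        if (a * b) % (a + b) == 0:
            c = a * b // (a + b)
            if gcd(gcd(a, b), c) == 1:
                brute.add((a, b, c))

# Solutions generated by a = u(u + v), b = v(u + v), c = uv with gcd(u, v) = 1.
param = set()
for u in range(1, N + 1):
    for v in range(u, N + 1):          # u <= v gives a <= b
        if gcd(u, v) != 1:
            continue
        a, b, c = u * (u + v), v * (u + v), u * v
        if b <= N:                     # a <= b, so only b needs checking
            param.add((a, b, c))

assert brute == param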
