Math, asked by jolly13, 1 year ago

If (a + 1/a)^2 = 3, then show that a^3 + 1/a^3 = 0.

Answers

Answered by Anonymous

Using the identity a^3 + b^3 = (a + b)(a^2 - ab + b^2) with b = 1/a:


(a^3 + 1/a^3) = (a + 1/a)(a^2 - 1 + 1/a^2)    ... equation (1)
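
As a quick check of this factorization (not part of the original answer; sympy is an assumed dependency), expanding the right-hand side recovers a^3 + b^3:

import sympy as sp

a, b = sp.symbols('a b')

# Expanding (a + b)(a^2 - ab + b^2) gives back a^3 + b^3.
print(sp.expand((a + b) * (a**2 - a*b + b**2)))  # prints: a**3 + b**3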


It is given that

(a + 1/a)^2 = 3

so, taking the positive square root (the negative root -√3 leads to the same final result),

(a + 1/a) = √3    ... equation (2)


Also,

(a + 1/a)^2 = 3

Expanding the square: a^2 + 2 + 1/a^2 = 3

Transposing the 2 to the right-hand side:

a^2 + 1/a^2 = 3 - 2

a^2 + 1/a^2 = 1    ... equation (3)


Putting (2) and (3) into (1), we have

a^3 + 1/a^3 = √3 × (1 - 1) = 0

So the answer is 0.
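
A side note beyond the original answer: for real a, (a + 1/a)^2 is at least 4, so the condition (a + 1/a)^2 = 3 forces a to be complex. A minimal numeric check in plain Python:

import cmath

# a + 1/a = √3 means a^2 - √3·a + 1 = 0, whose discriminant is
# 3 - 4 = -1 < 0, so a is complex: a = (√3 ± i)/2 = e^(±iπ/6).
a = (cmath.sqrt(3) + 1j) / 2

print((a + 1/a) ** 2)        # ≈ (3+0j): the given condition
print(a ** 3 + 1 / a ** 3)   # ≈ 0j: the claimed result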

hope it helps

thank you





jolly13: first line... the formula for a^3 + b^3 is (a+b)^3 - 3ab(a+b).
jolly13: so how did it become (a + b)(a^2 - ab + b^2)? can i know please..
Anonymous: that is the other standard formula: a^3 + b^3 also factors as (a + b)(a^2 - ab + b^2)
Anonymous: with b = 1/a, that factored form is exactly what this question needs
jolly13: okay.
Answered by rajeevsharma36

Answer:

Step-by-step explanation:

(a + 1/a)^2 = 3

a + 1/a = √3

Cubing both sides, we get

a^3 + 1/a^3 + 3(a + 1/a) = 3√3    [by using the formula (a+b)^3 = a^3 + b^3 + 3ab(a+b); here b = 1/a, so ab = 1]

Putting in the value a + 1/a = √3:

a^3 + 1/a^3 + 3√3 = 3√3

a^3 + 1/a^3 = 3√3 - 3√3 = 0    (the two 3√3 terms cancel each other)

Hence proved
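
As an extra cross-check that goes beyond the original answer, sympy (assumed available) can solve the given condition directly and evaluate a^3 + 1/a^3 at every root:

import sympy as sp

a = sp.symbols('a')

# Solve the given condition (a + 1/a)^2 = 3; all four roots are complex.
roots = sp.solve(sp.Eq((a + 1/a)**2, 3), a)

# a^3 + 1/a^3 simplifies to 0 at every root, confirming both answers.
for r in roots:
    print(sp.simplify(r**3 + 1/r**3))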
