if (a+1/a)^2=3 then show that a^3+ 1/a^3= 0.
Answers
Answered by
a^3 + b^3 = (a + b)(a^2 - ab + b^2)
so (a^3 + 1/a^3) = (a + 1/a)(a^2 - 1 + 1/a^2) ... equation (1)
it is given that
(a+1/a)^2 = 3
or
(a + 1/a) = √3 (taking the positive root; the negative root gives the same final result) ... equation (2)
Also
( a+ 1/a)^2 = 3
or a^2 + 1/a^2 + 2 = 3 (expanding the square)
then (transposing 2 to the right side)
a^2 + 1/a^2 = 3 - 2
a^2 + 1/a^2 = 1 ... equation (3)
Putting (2) and (3) into (1), we have
a^3 + 1/a^3 = √3 × (1 - 1) = 0
so the answer is 0
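As a quick numeric sanity check (not part of the original answer): since a + 1/a = √3 means a^2 - √3·a + 1 = 0, the discriminant is 3 - 4 = -1, so a must be complex, e.g. a = (√3 + i)/2. A short Python sketch confirms both the given condition and the result:

```python
# a = (sqrt(3) + i)/2 satisfies a + 1/a = sqrt(3), since |a| = 1
# and 1/a is then the complex conjugate of a.
a = (3 ** 0.5 + 1j) / 2

square = (a + 1 / a) ** 2        # should equal 3
cube_sum = a ** 3 + 1 / a ** 3   # should equal 0

print(abs(square - 3) < 1e-9, abs(cube_sum) < 1e-9)  # True True
```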
hope it helps
thank you
jolly13:
first line... the formula for a^3 + b^3 is (a + b)^3 - 3ab(a + b).
Answered by
Answer:
Step-by-step explanation:
(a + 1/a)^2 = 3
a + 1/a = √3 (taking the positive root)
Cubing both sides, we get
a^3 + 1/a^3 + 3(a + 1/a) = 3√3 [using the formula (a + b)^3 = a^3 + b^3 + 3ab(a + b), here with b = 1/a so that ab = 1]
a^3 + 1/a^3 + 3√3 = 3√3 (substituting the value √3 for a + 1/a)
a^3 + 1/a^3 = 3√3 - 3√3
a^3 + 1/a^3 = 0 (the 3√3 terms cancel)
Hence proved
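The cubing identity this answer relies on can be spot-checked numerically; this is a small illustrative sketch, not part of the original proof:

```python
import random

# Spot-check the identity (a + b)^3 = a^3 + b^3 + 3ab(a + b)
# on a few random values, then apply it with b = 1/a (so ab = 1).
for _ in range(5):
    a = random.uniform(0.5, 5.0)
    b = random.uniform(0.5, 5.0)
    assert abs((a + b) ** 3 - (a ** 3 + b ** 3 + 3 * a * b * (a + b))) < 1e-6

# With a + 1/a = sqrt(3) and ab = 1, the identity gives
# a^3 + 1/a^3 = (sqrt(3))^3 - 3*sqrt(3) = 3*sqrt(3) - 3*sqrt(3) = 0.
s = 3 ** 0.5
print(abs(s ** 3 - 3 * s) < 1e-12)  # True
```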