Math, asked by gurdeepkaur511, 11 months ago

If \((a + \frac{1}{a})^2 = 3\), then show that \(a^3 + \frac{1}{a^3} = 0\).

Answers

Answered by devraaz170

Step-by-step explanation:

\[
\left(a + \frac{1}{a}\right)^{2} = 3 \;\Rightarrow\; a + \frac{1}{a} = \pm\sqrt{3}
\]
\[
\begin{aligned}
a^{3} + \frac{1}{a^{3}} &= \left(a + \frac{1}{a}\right)\left(a^{2} - a\cdot\frac{1}{a} + \frac{1}{a^{2}}\right) \\
&= \left(a + \frac{1}{a}\right)\left(\left(a + \frac{1}{a}\right)^{2} - 2\cdot a\cdot\frac{1}{a} - 1\right) \\
&= \left(\pm\sqrt{3}\right)\times\left(3 - 2 - 1\right) \\
&= \left(\pm\sqrt{3}\right)\times 0 \\
&= 0
\end{aligned}
\]

proved.
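As a quick numerical check (an illustrative Python sketch, not part of the original answer): the condition \(a + \frac{1}{a} = \pm\sqrt{3}\) has no real solution (the quadratic \(a^2 \mp \sqrt{3}\,a + 1 = 0\) has discriminant \(-1\)), so we verify with complex roots via `cmath`.

```python
import cmath

s = 3 ** 0.5  # a + 1/a = ±sqrt(3), so a solves a^2 ∓ sqrt(3)·a + 1 = 0
for sign in (+1, -1):
    # quadratic formula; discriminant s^2 - 4 = -1, hence the complex sqrt
    a = (sign * s + cmath.sqrt(s * s - 4)) / 2
    assert abs((a + 1 / a) ** 2 - 3) < 1e-12   # hypothesis: (a + 1/a)^2 = 3
    assert abs(a ** 3 + 1 / a ** 3) < 1e-12    # conclusion: a^3 + 1/a^3 ≈ 0
print("verified for both roots")
```

Both roots satisfy the hypothesis, and in each case \(a^3 + \frac{1}{a^3}\) comes out as \(0\) (up to floating-point error), matching the algebraic proof.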

