Math, asked by nainamishra1, 1 year ago

If (a + 1/a)² = 3, where a ≠ 0, show that a³ + 1/a³ = 0.

Answers

Answered by rakeshmohata
Hope you like the process.
-------------------------------------
HERE WE GO:

Formulas to be used:
———————————
x³ + y³ = (x + y)(x² - xy + y²)
(x + y)² = x² + y² + 2xy
~~~~~~~~~~~~~~~~~~~~~~

So,

(a + 1/a)² = 3

or, a² + 1/a² + 2·a·(1/a) = 3

or, a² + 1/a² + 2 = 3    (since a·(1/a) = 1)

or, a² + 1/a² = 3 - 2 = 1
----------------------
NOW,

--> a³ + 1/a³
= (a + 1/a)(a² + 1/a² - a·(1/a))
= (a + 1/a)(1 - 1)    (since a² + 1/a² = 1 and a·(1/a) = 1)
= (a + 1/a) × 0 = 0    (proved)
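
As a quick numerical sanity check, here is a minimal Python sketch that picks the complex roots of a + 1/a = √3 (one branch of the given condition (a + 1/a)² = 3) and confirms that a³ + 1/a³ comes out to 0 up to rounding. The specific root choice is only for illustration; any a satisfying the condition behaves the same way.

import cmath

# (a + 1/a)^2 = 3 gives a + 1/a = ±sqrt(3).
# Taking the '+' branch, a solves a^2 - sqrt(3)*a + 1 = 0, whose roots are
# complex (discriminant = 3 - 4 = -1).
sqrt3 = cmath.sqrt(3)
disc = cmath.sqrt(3 - 4)
for a in ((sqrt3 + disc) / 2, (sqrt3 - disc) / 2):
    print("a            =", a)
    print("(a + 1/a)^2  =", (a + 1/a) ** 2)        # expect 3 (up to rounding)
    print("a^3 + 1/a^3  =", a ** 3 + 1 / a ** 3)   # expect 0 (up to rounding)

Both roots print values within floating-point error of 3 and 0, matching the algebraic result above.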
-_-_-_-_-_-_-_-_-_-_-
Hope you got it.

Proud to help you.
