If (a + 1/a)² = 3 and a ≠ 0, show that a³ + 1/a³ = 0 using expansions.
Answers
Answer:
a³ + 1/a³ = 0 (in both cases, a + 1/a = √3 and a + 1/a = -√3)
Step-by-step explanation:
⇒ Since (a + 1/a)² = 3, we get a + 1/a = √3 or a + 1/a = -√3.
⇒ Expanding the cube: (a + 1/a)³ = a³ + 1/a³ + 3·a·(1/a)·(a + 1/a), so a³ + 1/a³ = (a + 1/a)³ - 3(a + 1/a).
⇒ Taking a + 1/a = √3: a³ + 1/a³ = (√3)³ - 3√3 = 3√3 - 3√3 = 0.
⇒ Taking a + 1/a = -√3: a³ + 1/a³ = (-√3)³ - 3(-√3) = -3√3 + 3√3 = 0.
⇒ Thus a³ + 1/a³ = 0 in both cases.
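The same working can be written out as a small compilable LaTeX derivation; this is a minimal sketch of the cube-identity expansion used in the steps above.

\documentclass{article}
\usepackage{amsmath}
\begin{document}
Given $\left(a+\frac{1}{a}\right)^{2}=3$ and $a\neq 0$, we have $a+\frac{1}{a}=\pm\sqrt{3}$.
% Expand the cube and isolate a^3 + 1/a^3:
\begin{align*}
\left(a+\frac{1}{a}\right)^{3} &= a^{3}+\frac{1}{a^{3}}+3\cdot a\cdot\frac{1}{a}\left(a+\frac{1}{a}\right)\\
a^{3}+\frac{1}{a^{3}} &= \left(a+\frac{1}{a}\right)^{3}-3\left(a+\frac{1}{a}\right)\\
 &= \left(\pm\sqrt{3}\right)^{3}-3\left(\pm\sqrt{3}\right)\\
 &= \pm 3\sqrt{3}\mp 3\sqrt{3} = 0.
\end{align*}
\end{document}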