Prove that the roots of the equation bx^2 + (b-c)x + (b-c-a) = 0 are real if those of ax^2 + 2bx + b = 0 are imaginary.
Answers
Since the roots of ax^2 + 2bx + b = 0 are imaginary, its discriminant is negative: (2b)^2 - 4ab = 4b^2 - 4ab < 0.
Now let's examine the discriminant of the first equation. It is
(b-c)^2 - 4b(b-c-a)
= b^2 - 2bc + c^2 - 4b^2 + 4bc + 4ab
= b^2 + 2bc + c^2 - (4b^2 - 4ab)
= (b + c)^2 - (4b^2 - 4ab).
Since (b + c)^2 >= 0 and 4b^2 - 4ab < 0, we are subtracting a negative quantity from a nonnegative one, so the discriminant is strictly positive. Hence the roots of the first equation are real and distinct.
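As a quick numerical sanity check (a sketch to illustrate the claim, not part of the proof), the Python snippet below samples random coefficients, keeps only triples that satisfy the hypothesis 4b^2 - 4ab < 0, and verifies that the first equation's discriminant comes out positive:

import random

# Whenever the roots of a*x^2 + 2b*x + b = 0 are imaginary
# (i.e. its discriminant 4b^2 - 4ab < 0), the discriminant of
# b*x^2 + (b-c)*x + (b-c-a) = 0 should be strictly positive.
for _ in range(100_000):
    a = random.uniform(-10, 10)
    b = random.uniform(-10, 10)
    c = random.uniform(-10, 10)
    if 4*b*b - 4*a*b >= 0:
        continue  # hypothesis not met; skip this triple
    disc = (b - c)**2 - 4*b*(b - c - a)
    assert disc > 0, (a, b, c, disc)
print("No counterexample found in 100,000 trials.")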