Show that the equation 1/(x−a) + 1/(x−b) + 1/(x−c) = 0 has exactly two real roots, where a &lt; b &lt; c are real numbers.
Answers
Step-by-step explanation:
Let f(x)=(x−a)(x−b)(x−c). Then
f′(x)/f(x) = 1/(x−a) + 1/(x−b) + 1/(x−c).
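To see where this identity comes from, differentiate f with the product rule:
f′(x) = (x−b)(x−c) + (x−a)(x−c) + (x−a)(x−b),
and dividing each term by f(x) = (x−a)(x−b)(x−c) gives exactly 1/(x−a) + 1/(x−b) + 1/(x−c).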
Since the left-hand side of the given equation equals f′(x)/f(x), the problem amounts to proving that f′(x)=0 has exactly two real roots, none of them equal to a, b, or c.
Since f(x) is cubic, f′(x) is quadratic, so it has at most two real roots.
On the other hand, f(a)=f(b)=f(c)=0 with a &lt; b &lt; c, so by Rolle's Theorem f′ has a root in the open interval (a,b) and another in (b,c). These two roots are distinct and, because the intervals are open, none of them equals a, b, or c. Hence f′(x)=0, and therefore the original equation, has exactly two real roots.
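For a quick numerical sanity check of this conclusion, here is a minimal sketch using sympy, assuming the arbitrary sample values a=1, b=2, c=4 (these numbers are not part of the original problem; any distinct a &lt; b &lt; c behave the same way):

```python
# Minimal numerical check with arbitrary sample values a < b < c
# (the specific numbers 1, 2, 4 are an assumption for illustration only).
import sympy as sp

x = sp.symbols('x')
a, b, c = 1, 2, 4
f = (x - a) * (x - b) * (x - c)

# Solve f'(x) = 0: f' is quadratic, so there are at most two roots.
critical_points = sorted(float(r) for r in sp.solve(sp.diff(f, x), x))

print(critical_points)              # exactly two real values
print(a < critical_points[0] < b)   # True: one root lies in (a, b)
print(b < critical_points[1] < c)   # True: the other lies in (b, c)
```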