Difference between log loss, cross entropy, and hinge loss
Answers
Log loss is simply cross entropy applied to binary classification, so the two terms describe the same quantity; both are probabilistic losses derived from maximum likelihood, penalizing a model more the less probability it assigns to the correct class. The main difference between the hinge loss and the cross entropy loss is that the former arises from trying to maximize the margin between our decision boundary and data points - it is zero once a point is classified correctly with a margin of at least one, thus attempting to ensure that each point is correctly and confidently classified, whereas cross entropy keeps pushing predicted probabilities toward the true labels no matter how confident the model already is.
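A minimal NumPy sketch of the two losses (the labels, probabilities, and raw scores below are made-up illustrative values; note that hinge loss operates on raw decision scores with labels in {-1, +1}, while log loss operates on probabilities with labels in {0, 1}):

```python
import numpy as np

# Hypothetical binary labels in {0, 1} and predicted probabilities.
y = np.array([1, 0, 1, 1])
p = np.array([0.9, 0.2, 0.6, 0.8])

# Log loss (binary cross entropy): -mean(y*log(p) + (1-y)*log(1-p)).
log_loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Hinge loss uses labels in {-1, +1} and raw scores (not probabilities).
y_pm = 2 * y - 1                           # map {0, 1} -> {-1, +1}
scores = np.array([2.0, -1.5, 0.3, 1.0])   # hypothetical raw decision scores
hinge = np.mean(np.maximum(0.0, 1 - y_pm * scores))
```

Notice that the first, second, and fourth points contribute zero hinge loss because they are classified with margin >= 1, while every point still contributes a nonzero amount to the log loss.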