Why does cross entropy measure the difference between two distributions?
Answers
Cross entropy is one of many possible loss functions (another popular one is the SVM hinge loss). It measures the dissimilarity between two probability distributions: given a "true" distribution p and a predicted distribution q, the cross entropy H(p, q) = -Σ p(x) log q(x) is smallest when q matches p and grows as q diverges from it. Because the loss quantifies how far the model's predicted distribution is from the target, we can then use an optimizer such as the gradient descent algorithm to find its minimum.
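As a minimal sketch of this idea, the function below computes H(p, q) for two discrete distributions and shows that a prediction matching the true distribution scores lower than one that diverges from it (the distributions here are illustrative examples, not from the question):

```python
import math

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_i p_i * log(q_i).

    Smallest (equal to the entropy of p) when q == p,
    larger the more q diverges from p.
    """
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.7, 0.2, 0.1]       # "true" distribution
q_good = [0.7, 0.2, 0.1]  # prediction matching p exactly
q_bad = [0.1, 0.2, 0.7]   # prediction far from p

print(cross_entropy(p, q_good))  # low: equals the entropy of p
print(cross_entropy(p, q_bad))   # high: q_bad diverges from p
```

In a classifier, p is usually a one-hot label and q is the softmax output, so minimizing this loss with gradient descent pushes the predicted distribution toward the labels.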