How does cross-entropy find the difference between two distributions?
Answers
Cross-entropy is commonly used to quantify the difference between two probability distributions: a true distribution p and a predicted distribution q. It is defined as H(p, q) = −Σₓ p(x) log q(x), and it is smallest exactly when q matches p, so its value tells you how "wrong" or "far away" your prediction is from the true distribution. (Equivalently, H(p, q) is the entropy of p plus the KL divergence from p to q, so the excess over the entropy is the cost of predicting q instead of p.) Cross-entropy is one of many possible loss functions; another popular one is the SVM hinge loss.
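To make this concrete, here is a minimal sketch in Python (using NumPy) of the formula above, applied to a one-hot true distribution for a 3-class problem; the example probabilities are made up for illustration:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross-entropy H(p, q) = -sum_x p(x) * log q(x).

    p: true distribution, q: predicted distribution.
    eps guards against log(0) when q assigns zero probability.
    """
    q = np.clip(q, eps, 1.0)
    return -np.sum(p * np.log(q))

# True label of a 3-class example, one-hot encoded.
p = np.array([0.0, 1.0, 0.0])

# A confident, correct prediction gives a low cross-entropy...
print(cross_entropy(p, np.array([0.05, 0.90, 0.05])))  # ~0.105

# ...while a prediction that puts little mass on the true class
# gives a much higher value: the prediction is "far away" from p.
print(cross_entropy(p, np.array([0.80, 0.10, 0.10])))  # ~2.303
```

With a one-hot p, only the log-probability assigned to the true class survives the sum, which is why training with this loss pushes the model to put more probability on the correct class.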