Math, asked by aasthapatel6963, 1 year ago

How does cross-entropy find the difference between two distributions?

Answers

Answered by Anonymous
0

Cross-entropy is commonly used to quantify the difference between two probability distributions: it measures how "wrong" or "far away" your predicted distribution is from the true distribution. Cross-entropy is one of many possible loss functions (another popular one is the SVM hinge loss).
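As a minimal sketch of the idea in the answer above: cross-entropy H(p, q) = -Σ p(i) · log q(i) penalizes a prediction q more heavily the less probability it assigns to the outcomes the true distribution p considers likely. The function and distribution names below are illustrative, not from the original answer.

```python
import math

def cross_entropy(p, q):
    """Cross-entropy H(p, q) = -sum_i p_i * log(q_i), in nats.

    Skips terms where p_i == 0, using the convention 0 * log(q) = 0.
    """
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

# True distribution (here a one-hot label) vs. two predictions
p = [1.0, 0.0, 0.0]           # true class is index 0
q_good = [0.9, 0.05, 0.05]    # confident, correct prediction -> small loss
q_bad = [0.1, 0.6, 0.3]       # probability mass on the wrong class -> large loss

print(cross_entropy(p, q_good))   # low: prediction is close to p
print(cross_entropy(p, q_bad))    # high: prediction is "far away" from p
```

The farther q drifts from p, the larger the loss, which is exactly why it works as a training objective: minimizing cross-entropy pushes the predicted distribution toward the true one.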

Answered by deeksha7790
26

How does cross-entropy find the difference between two distributions?

Answer - Cross-entropy is commonly used to quantify the difference between two probability distributions: it measures how "wrong" or "far away" your predicted distribution is from the true distribution. Cross-entropy is one of many possible loss functions (another popular one is the SVM hinge loss).

HOPE IT HELPS!
