
Difference between logistic regression loss and cross entropy

Answers

Answered by paramita15092001

Answer:

Hey dear, here is your answer:

Explanation:

Logistic regression is a classification algorithm used to assign observations to a discrete set of classes. Unlike linear regression, which outputs continuous values, logistic regression transforms its output with the logistic sigmoid function to return a probability between 0 and 1, which is then mapped to one of two discrete classes (for more than two classes, the softmax generalization is used).
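
Here is a minimal sketch of that idea in Python/NumPy: a linear score is squashed by the sigmoid into a probability and then thresholded into a class. The weights, bias, and input values are made up for illustration.

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid: squashes any real-valued score into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical weights, bias, and observation (illustrative values only).
w, b = np.array([0.8, -0.4]), 0.1
x = np.array([1.5, 2.0])

p = sigmoid(w @ x + b)     # probability of the positive class
label = int(p >= 0.5)      # map the probability to a discrete class
print(p, label)            # e.g. 0.62..., 1
```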

Cross-entropy loss, or log loss, measures the performance of a classification model whose output is a probability value between 0 and 1. The loss increases as the predicted probability diverges from the actual label. For binary classification, the loss that logistic regression minimizes is exactly the binary cross-entropy, so "logistic regression loss" and "cross-entropy (log) loss" refer to the same quantity.
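
A short sketch of the binary cross-entropy computation, with made-up labels and predictions, shows how the loss grows as predictions diverge from the true labels:

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Log loss: -mean( y*log(p) + (1-y)*log(1-p) )."""
    y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# Illustrative labels and predicted probabilities.
y_true = np.array([1.0, 1.0, 0.0])
print(binary_cross_entropy(y_true, np.array([0.9, 0.8, 0.1])))  # close to labels -> small loss
print(binary_cross_entropy(y_true, np.array([0.2, 0.3, 0.9])))  # far from labels -> large loss
```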
