Claude Monet 'Morning on the Seine near Giverny'
Entropy measures the level of disorder in a system. Cross-Entropy Loss measures the performance of a classification model whose output is a probability between 0 and 1; the loss increases as the predicted probability diverges from the actual label.
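For concreteness, here is a minimal NumPy sketch of the two quantities (this example is mine, not from the article; the helper names and the example distributions are illustrative):

```python
import numpy as np

def entropy(p):
    """Entropy of a discrete distribution p: H(p) = -sum(p * log(p))."""
    p = np.asarray(p, dtype=float)
    return -np.sum(p * np.log(p + 1e-12))  # small epsilon avoids log(0)

def cross_entropy(p, q):
    """Cross-entropy of prediction q relative to true distribution p: -sum(p * log(q))."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return -np.sum(p * np.log(q + 1e-12))

true_dist = [1.0, 0.0]   # one-hot label for a two-class problem
good_pred = [0.9, 0.1]   # confident, correct prediction
bad_pred  = [0.1, 0.9]   # confident, wrong prediction

print(cross_entropy(true_dist, good_pred))  # ~0.105 (low loss)
print(cross_entropy(true_dist, bad_pred))   # ~2.303 (high loss)
```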
Some excellent Medium articles address loss functions in machine learning, including "Cross-Entropy Loss Function" and "Common Loss functions in machine learning".
When we predict events in terms of probabilities, the log loss penalizes a prediction according to the difference between the predicted probability and the actual value. If we know the true value and the model gets it wrong, the log loss is high, and it grows as the prediction becomes more confidently wrong. If the true values are themselves randomly distributed, following the predicted probabilities, the expected log loss equals the entropy of the prediction, so intermediate guesses (near 0.5) are penalized more than absolute guesses (near 0 or 1).
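A short sketch of the binary log loss makes the penalty concrete (the function and variable names here are my own, not from the article):

```python
import math

def log_loss(y_true, y_pred, eps=1e-15):
    """Binary log loss: -(y*log(p) + (1-y)*log(1-p)). Clipping avoids log(0)."""
    p = min(max(y_pred, eps), 1 - eps)
    return -(y_true * math.log(p) + (1 - y_true) * math.log(1 - p))

# True label is 1: the loss grows as the prediction moves toward 0.
for p in (0.9, 0.5, 0.1, 0.01):
    print(f"y=1, p={p}: loss={log_loss(1, p):.3f}")
# y=1, p=0.9:  loss=0.105
# y=1, p=0.5:  loss=0.693
# y=1, p=0.1:  loss=2.303
# y=1, p=0.01: loss=4.605
```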
Notice how the log loss increases as we incorrectly predict a value of 0 when it should be 1. Also, notice how, for randomly distributed true values, the log loss is low for confident guesses of 0 and 1 and higher for uncertain guesses near 0.5.
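Curves like the ones described above can be reproduced with matplotlib. This is a sketch under the assumption that the original figure plotted loss against the predicted probability:

```python
import numpy as np
import matplotlib.pyplot as plt

p = np.linspace(0.001, 0.999, 500)

# Log loss when the true value is known to be 1: -log(p).
known_loss = -np.log(p)

# Expected loss when the true value is itself random, following probability p:
# the binary entropy -p*log(p) - (1-p)*log(1-p), low at 0 and 1, peaking at 0.5.
random_loss = -p * np.log(p) - (1 - p) * np.log(1 - p)

plt.plot(p, known_loss, label="true value = 1")
plt.plot(p, random_loss, label="random true value")
plt.xlabel("predicted probability")
plt.ylabel("log loss")
plt.legend()
plt.show()
```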