Tim Burns

Entropy and Log Loss


Claude Monet 'Morning on the Seine near Giverny'

Entropy measures the level of disorder in a system. Cross-entropy loss measures the performance of a classification model by penalizing predicted probabilities that diverge from the true labels.
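To make both ideas concrete, here is a minimal Python sketch. The helper names (entropy, cross_entropy) and the probability values are mine, chosen for illustration only:

    import numpy as np

    def entropy(p):
        """Shannon entropy of a discrete distribution, in nats."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]  # treat 0 * log(0) as 0
        return -np.sum(p * np.log(p))

    def cross_entropy(y_true, y_pred, eps=1e-15):
        """Average cross-entropy between one-hot labels and predicted probabilities."""
        y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
        return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

    # A fair coin is maximally disordered; a loaded coin is not.
    print(entropy([0.5, 0.5]))   # ~0.693 (log 2)
    print(entropy([0.9, 0.1]))   # ~0.325

    # Confident, correct predictions earn a lower cross-entropy loss.
    y_true = np.array([[1, 0], [0, 1]])
    print(cross_entropy(y_true, np.array([[0.9, 0.1], [0.2, 0.8]])))  # ~0.164
    print(cross_entropy(y_true, np.array([[0.4, 0.6], [0.7, 0.3]])))  # ~1.060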


Some excellent Medium articles address loss functions in machine learning:

  • Cross-Entropy Loss Function

  • Common Loss functions in machine learning

When we predict events probabilistically, the log loss penalizes a prediction based on the difference between the predicted probability and the actual value. If we know the true value and the model confidently gets it wrong, the log loss is large. And if the actual values are randomly distributed, the log loss still penalizes intermediate guesses relative to correct absolute guesses: a guess of 0.5 always costs log 2, while a confident guess that turns out to be right costs almost nothing.
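For a single binary prediction, the log loss is -(y*log(p) + (1 - y)*log(1 - p)), where y is the actual value and p is the predicted probability. A quick sketch of the penalties, with probabilities I picked for illustration:

    import numpy as np

    def log_loss_single(y, p, eps=1e-15):
        """Binary log loss for one example: -(y*log(p) + (1-y)*log(1-p))."""
        p = np.clip(p, eps, 1 - eps)  # avoid log(0)
        return -(y * np.log(p) + (1 - y) * np.log(1 - p))

    # True value is 1: the penalty explodes as the prediction approaches 0.
    print(log_loss_single(1, 0.99))  # ~0.01  (confident and right)
    print(log_loss_single(1, 0.50))  # ~0.69  (hedged; this is log 2)
    print(log_loss_single(1, 0.01))  # ~4.61  (confident and wrong)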



[Figure: per-example log loss as a function of the predicted probability]

Notice how the log loss climbs sharply as we confidently predict a value of 0 when it should be 1. Also notice how, when the actual values are randomly distributed, predictions of 0 and 1 can earn a low log loss when they land on the right side, while uncertain (0.5) guesses never drop below log 2.
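The figure is easy to reproduce. Here is a rough matplotlib sketch, assuming the plot showed the per-example log loss against the predicted probability:

    import numpy as np
    import matplotlib.pyplot as plt

    p = np.linspace(0.001, 0.999, 500)  # predicted probability of class 1

    # Per-example log loss curves for each possible actual value.
    plt.plot(p, -np.log(p), label="actual value = 1")
    plt.plot(p, -np.log(1 - p), label="actual value = 0")
    plt.axhline(np.log(2), linestyle="--", label="loss of a 0.5 guess (log 2)")
    plt.xlabel("predicted probability")
    plt.ylabel("log loss")
    plt.legend()
    plt.show()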
