Cross Entropy

In machine learning, the cross-entropy loss is a very popular loss function. Here is its mathematical definition: $$ H(P^*, P) = - \sum_{i} P^*(i) \log P(i) $$ I had been using cross entropy for decades without truly understanding this loss function. Recently, I watched a few YouTube videos and want to share my most recent understanding of it. Essentially, cross entropy is very useful for measuring the difference between two distributions. ...
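The definition above translates directly into code. Here is a minimal sketch in plain Python (not from the post; the function name and example distributions are my own) that computes $H(P^*, P)$ for two discrete distributions over the same outcomes:

```python
import math

def cross_entropy(p_true, p_pred):
    """H(P*, P) = -sum_i P*(i) * log(P(i)).
    p_true is the reference distribution P*, p_pred is the model's P.
    Terms where P*(i) == 0 contribute nothing, so they are skipped."""
    return -sum(pt * math.log(pp) for pt, pp in zip(p_true, p_pred) if pt > 0)

# Example: a one-hot "true" label vs. a model's predicted probabilities.
p_star = [0.0, 1.0, 0.0]
p_hat = [0.1, 0.7, 0.2]
print(cross_entropy(p_star, p_hat))  # -log(0.7), about 0.357
```

With a one-hot $P^*$, the sum collapses to $-\log P(\text{correct class})$, which is why cross entropy with one-hot targets is the same as negative log-likelihood of the true class.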

February 23, 2025