
Cross Entropy Loss

The cross entropy between two probability distributions p and q over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set when the coding scheme is optimized for the estimated distribution q rather than for the true distribution p.
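
For discrete distributions, with p the true distribution and q the estimated one, this is the standard formula:

    H(p, q) = -\sum_{x} p(x) \log q(x)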

In machine learning, cross entropy is widely used as a loss function. It measures how far the predicted probability distribution is from the true distribution: the loss is large when the model assigns low probability to the correct outcome and approaches zero as the prediction matches the truth.
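
As a concrete illustration, here is a minimal NumPy sketch of cross entropy loss for a single classification example; the function name, variable names, and the clipping constant eps are illustrative assumptions, not from the original text.

    import numpy as np

    def cross_entropy_loss(y_true, y_pred, eps=1e-12):
        """Cross entropy between a true distribution and predicted probabilities.

        y_true: true probabilities (e.g. a one-hot label vector).
        y_pred: predicted probabilities summing to 1.
        eps: small constant to avoid log(0) (an illustrative choice).
        """
        y_pred = np.clip(y_pred, eps, 1.0)       # guard against log(0)
        return -np.sum(y_true * np.log(y_pred))  # H(p, q) = -sum_x p(x) log q(x)

    # Example: the true class is index 1. A confident correct prediction
    # yields a small loss; a confident wrong one yields a large loss.
    print(cross_entropy_loss(np.array([0, 1, 0]), np.array([0.1, 0.8, 0.1])))  # ~0.223
    print(cross_entropy_loss(np.array([0, 1, 0]), np.array([0.8, 0.1, 0.1])))  # ~2.303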

Regularization

Regularization is the process of introducing additional information or constraints into a model to prevent overfitting and improve generalization. Common techniques include L1 and L2 weight penalties, dropout, and early stopping; an L2 example is sketched below.
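
As a minimal sketch, the following adds an L2 penalty to a squared-error loss; the function name, the lam parameter, and the choice of base loss are illustrative assumptions, not a specific library's API.

    import numpy as np

    def l2_regularized_loss(w, X, y, lam=0.01):
        """Squared-error loss plus an L2 penalty on the weights.

        lam (the regularization strength) is an illustrative choice;
        larger values penalize large weights more heavily.
        """
        predictions = X @ w
        data_loss = np.mean((predictions - y) ** 2)  # fit to the data
        penalty = lam * np.sum(w ** 2)               # discourage large weights
        return data_loss + penalty

    # Example with illustrative numbers: two data points, two features.
    X = np.array([[1.0, 2.0], [3.0, 4.0]])
    y = np.array([1.0, 2.0])
    w = np.array([0.5, 0.1])
    print(l2_regularized_loss(w, X, y))

Increasing lam shrinks the learned weights toward zero, trading a slightly worse fit on the training data for better behavior on unseen data.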
