Cross Entropy Loss

The cross entropy between two probability distributions p and q over the same underlying set of events measures the average number of bits needed to identify an event drawn from the set when the coding scheme is optimized for the predicted distribution q rather than for the true distribution p.

Cross entropy can be used to calculate loss. For a true distribution p and a predicted distribution q, the cross entropy loss is:

H(p, q) = -Σ p(x) log q(x)

where the sum runs over all events x. For classification with one-hot labels, all terms but one vanish, and the loss reduces to -log q(correct class).
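As a minimal sketch, the formula above can be implemented directly in Python with NumPy (the function name and the clipping constant are illustrative choices, not from the original text):

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross entropy H(p, q) = -sum(p * log(q)), in nats.

    p: true distribution (e.g. one-hot labels).
    q: predicted probabilities.
    eps clips q away from zero so log() stays finite.
    """
    q = np.clip(q, eps, 1.0)
    return -np.sum(p * np.log(q))

# One-hot true label and a softmax-style prediction (illustrative values):
p = np.array([1.0, 0.0, 0.0])
q = np.array([0.7, 0.2, 0.1])
loss = cross_entropy(p, q)  # equals -log(0.7)
```

Because p is one-hot here, only the predicted probability of the correct class contributes to the loss.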

Regularization

Regularization is the process of introducing additional information, such as a penalty on model complexity, to prevent overfitting and improve generalization (it typically raises training loss slightly in exchange for lower loss on unseen data). Common techniques include:

- L1 regularization (lasso), which penalizes the sum of absolute weight values
- L2 regularization (ridge), which penalizes the sum of squared weight values
- Dropout, which randomly zeroes activations during training
- Early stopping, which halts training before the model overfits
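A minimal sketch of the penalty idea, assuming an L2 term added to a data loss (the function names and the strength `lam` are illustrative, not from the original text):

```python
import numpy as np

def l2_penalty(weights, lam=0.01):
    """L2 (ridge) penalty: lam times the sum of squared weights."""
    return lam * np.sum(weights ** 2)

def regularized_loss(data_loss, weights, lam=0.01):
    """Total objective = data-fitting loss + complexity penalty."""
    return data_loss + l2_penalty(weights, lam)

# Illustrative weights and a hypothetical data loss of 0.35:
w = np.array([0.5, -1.2, 2.0])
total = regularized_loss(0.35, w, lam=0.1)
```

Minimizing `total` instead of `data_loss` alone pushes the optimizer toward smaller weights, which is what discourages overfitting.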

An Example

An example is: suppose the true label is the first of three classes, so p = [1, 0, 0], and the model predicts q = [0.7, 0.2, 0.1]. The cross entropy loss is -log(0.7) ≈ 0.357 nats. Only the predicted probability assigned to the correct class contributes, and the loss grows as that probability shrinks.
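As a concrete sketch, the following compares the loss for a confident correct prediction against a confident wrong one (the probability values are illustrative):

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """Cross entropy H(p, q) = -sum(p * log(q)), in nats."""
    q = np.clip(q, eps, 1.0)
    return -np.sum(p * np.log(q))

p = np.array([1.0, 0.0, 0.0])                  # true class is index 0
confident_right = np.array([0.9, 0.05, 0.05])  # mass on the correct class
confident_wrong = np.array([0.05, 0.9, 0.05])  # mass on a wrong class

loss_right = cross_entropy(p, confident_right)  # -log(0.9) ~ 0.105
loss_wrong = cross_entropy(p, confident_wrong)  # -log(0.05) ~ 3.00
```

The loss is small when the model is confidently right and large when it is confidently wrong, which is exactly the behavior a classification loss should have.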