Loss Function

Cross-entropy

Cross-entropy measures the dissimilarity between two probability distributions: a true distribution \(p\) and a predicted distribution \(q\). It is the standard loss for classification.

\[H(p,q) = - \sum_x p(x) \log q(x)\]

The sum runs over every class \(x\). When \(p\) is a one-hot label, only the term for the true class is nonzero, so the loss reduces to \(-\log q(\text{true class})\), as shown in the sketch below.
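
A minimal sketch of evaluating the formula with NumPy; the function name and the toy values are assumptions for illustration, not from the note:

```python
import numpy as np

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_x p(x) * log(q(x)), clipping q to avoid log(0)."""
    q = np.clip(q, eps, 1.0)
    return -np.sum(p * np.log(q))

# One-hot true label over three classes: only the true class contributes.
p = np.array([0.0, 1.0, 0.0])
q = np.array([0.1, 0.7, 0.2])

print(cross_entropy(p, q))  # ~0.357
print(-np.log(0.7))         # same value: the loss collapses to -log q(true class)
```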

Read: Stack Overflow