Multi-Class Cross Entropy
Our Probabilities
We have three doors, and behind each door is one of three animals. The table below gives the probabilities that, if you open a given door, you will find a particular animal behind it.
| Animal | Door 1 | Door 2 | Door 3 |
|---|---|---|---|
| Duck | \(P_{11}\) | \(P_{12}\) | \(P_{13}\) |
| Beaver | \(P_{21}\) | \(P_{22}\) | \(P_{23}\) |
| Walrus | \(P_{31}\) | \(P_{32}\) | \(P_{33}\) |
\[ \textit{Cross Entropy} = - \sum^n_{i=1} \sum^m_{j=1} y_{ij} \ln (p_{ij}) \]
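Here \(y_{ij}\) is the label that is 1 if animal \(i\) is actually behind door \(j\) and 0 otherwise, so only the terms for the true outcome survive the double sum. As an illustration (this particular placement is made up), if a duck is behind Door 1, a walrus behind Door 2, and a beaver behind Door 3, the formula reduces to:

\[ \textit{Cross Entropy} = -\left( \ln(p_{11}) + \ln(p_{32}) + \ln(p_{23}) \right) \]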
So, what does this mean?
Cross entropy is the negative logarithm of the total probability of an outcome, so the higher the cross entropy you calculate, the less likely it is that the outcome you are looking at will happen.
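To make this concrete, here is a minimal sketch in Python using NumPy. The probability values and the placement of the animals are invented for this example; they just need to match the table layout above, with each column (door) summing to 1.

```python
import numpy as np

# Probabilities p[i][j]: rows are animals (duck, beaver, walrus),
# columns are doors (Door 1, Door 2, Door 3). Each column sums to 1.
p = np.array([
    [0.7, 0.1, 0.2],   # duck
    [0.2, 0.2, 0.6],   # beaver
    [0.1, 0.7, 0.2],   # walrus
])

# One-hot labels y[i][j]: 1 if animal i is actually behind door j.
# Assumed placement: duck behind Door 1, walrus behind Door 2,
# beaver behind Door 3.
y = np.array([
    [1, 0, 0],   # duck
    [0, 0, 1],   # beaver
    [0, 1, 0],   # walrus
])

# Cross Entropy = -sum_i sum_j y_ij * ln(p_ij)
cross_entropy = -np.sum(y * np.log(p))
print(cross_entropy)  # -(ln 0.7 + ln 0.7 + ln 0.6) ≈ 1.224
```

Because only the entries where \(y_{ij} = 1\) survive, this is the same as adding up the negative log of the probability assigned to the true animal behind each door.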