Cross-entropy is commonly used to quantify the difference between two probability distributions.
Cross-entropy loss measures how close the predicted distribution is to the true distribution.
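As a minimal sketch, discrete cross-entropy can be computed as H(p, q) = -Σ p_i · log(q_i); the function and distribution names below are illustrative:

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum(p_i * log(q_i)) over a discrete distribution.
    Terms with p_i == 0 contribute nothing, so they are skipped."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.0, 1.0, 0.0]   # true distribution (one-hot: class 1)
q = [0.1, 0.8, 0.1]   # predicted distribution
loss = cross_entropy(p, q)  # reduces to -log(0.8) for a one-hot target
```

With a one-hot target, the loss reduces to the negative log of the probability the model assigned to the correct class, so better predictions give a lower loss.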
Why the Negative Sign?
Log loss uses the negative log to give an easy metric for comparison. The log of a probability (a number between 0 and 1) is negative, which is confusing to work with when comparing the performance of two models; negating it yields a positive loss that shrinks as the predicted probability of the correct class approaches 1.
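A small sketch of this sign flip: the log of each probability below is negative, while its negation is positive and decreases as the probability grows:

```python
import math

# log of a probability in (0, 1) is negative; -log is positive
for prob in (0.9, 0.5, 0.1):
    print(f"p={prob}: log={math.log(prob):.3f}, -log={-math.log(prob):.3f}")
```

Note that a confident correct prediction (p = 0.9) is penalized far less than a confident wrong one (p = 0.1), which is exactly the ordering we want in a loss.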
