
Machine Learning Interview Questions

Cross Entropy or Log Loss

Cross-entropy is commonly used to quantify the difference between two probability distributions.
As a loss function, cross-entropy measures how close the predicted distribution is to the true distribution: the closer the predictions are to the true labels, the lower the loss.
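A minimal NumPy sketch of this idea (the `cross_entropy` helper and the example probabilities are illustrative, not from a specific library):

```python
import numpy as np

def cross_entropy(y_true, y_pred, eps=1e-12):
    """Cross-entropy between a true distribution and a predicted one."""
    y_pred = np.clip(y_pred, eps, 1.0)  # avoid log(0)
    return -np.sum(y_true * np.log(y_pred))

# One-hot true label: the correct class is class 1 (of 3)
y_true = np.array([0.0, 1.0, 0.0])

good = cross_entropy(y_true, np.array([0.1, 0.8, 0.1]))  # confident and correct
bad = cross_entropy(y_true, np.array([0.6, 0.2, 0.2]))   # mostly wrong

print(good, bad)  # the better prediction yields the smaller loss
```

Note that with a one-hot `y_true`, the sum reduces to the negative log of the probability assigned to the correct class.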

Why the Negative Sign?
Log Loss uses the negative log to produce a convenient, positive metric for comparison. The log of a probability (a number between 0 and 1) is always negative, which is confusing to work with when comparing the performance of two models. Negating it yields a positive loss where lower values mean better predictions.
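A quick sketch of why the negation helps, using only the standard library (the probabilities are made up for illustration):

```python
import math

p = 0.9  # predicted probability assigned to the correct class
print(math.log(p))   # raw log is negative
print(-math.log(p))  # negated: a positive loss, near 0 for good predictions

# As the predicted probability for the correct class drops,
# the negative log grows, penalizing worse predictions more.
for q in (0.99, 0.9, 0.5, 0.1):
    print(q, -math.log(q))
```

This makes model comparison straightforward: the model with the smaller (positive) loss is the better one.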