Cross-Entropy Loss [Multi-Class Classification]

Description

  • Use case: Multi-class classification problems
  • When to use: Multi-class classification tasks where the model outputs a predicted probability distribution over multiple classes (softmax probabilities). The loss measures the dissimilarity between the predicted probability distribution and the actual distribution (one-hot encoded true labels); a minimal usage sketch follows this list.
  • Key property: Pushes the model to assign high probability to the correct class while minimizing the probability of other classes.
  • Example applications:

    • Image classification tasks (e.g., identifying objects like cars, cats, or trees in an image)
    • Natural Language Processing (e.g., sentiment analysis, text classification)

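The sketch below, assuming PyTorch, shows a typical way this loss is applied in practice. Note that `torch.nn.CrossEntropyLoss` expects raw logits and integer class indices and applies log-softmax internally, rather than taking softmax probabilities directly; the tensor shapes and values here are illustrative only.

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

# Batch of 4 examples with 3 classes: raw (unnormalized) scores from a model
logits = torch.randn(4, 3)
# True class index for each example
targets = torch.tensor([0, 2, 1, 2])

loss = criterion(logits, targets)
print(loss.item())
```
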
Formula

$$\text{Cross-Entropy Loss} = -\frac{1}{N}\sum_{i=1}^{N}\sum_{c=1}^{C} y_{i,c}\,\log(\hat{y}_{i,c})$$
  • $N$ is the number of examples
  • $C$ is the number of classes
  • $y_{i,c}$ is the true label for class $c$ of the $i$-th example (1 for the correct class, 0 otherwise)
  • $\hat{y}_{i,c}$ is the predicted probability of class $c$ for the $i$-th example
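A minimal NumPy sketch of the formula above, computing the mean over examples of the negative log-probability assigned to the true class. The function name and the `eps` clipping constant are illustrative choices, not part of the original text.

```python
import numpy as np

def cross_entropy_loss(y_true, y_pred, eps=1e-12):
    """Cross-entropy over N examples and C classes.

    y_true: (N, C) one-hot encoded true labels
    y_pred: (N, C) predicted softmax probabilities
    """
    # Clip predictions to avoid log(0)
    y_pred = np.clip(y_pred, eps, 1.0)
    # -1/N * sum_i sum_c y_{i,c} * log(y_hat_{i,c})
    return -np.mean(np.sum(y_true * np.log(y_pred), axis=1))

# Example: 3 examples, 3 classes
y_true = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 0, 1]])
y_pred = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1],
                   [0.2, 0.2, 0.6]])
print(cross_entropy_loss(y_true, y_pred))  # ~0.36
```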