Binary Cross-Entropy (BCE) loss is the special case of Cross-Entropy loss for problems with exactly two classes. Because each output is an independent yes/no decision, it is also the standard loss for multi-label classification (taggers), where it is applied to each label independently. It expects probabilities produced by a Sigmoid activation.
Mathematically, it is given as:

BCE(y, p) = −(y · log(p) + (1 − y) · log(1 − p))

Where y is the true label (0 or 1) and p is the predicted probability of the positive class.
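The formula above can be sketched directly in NumPy. This is a minimal illustration, not a library implementation; the function name and the epsilon clipping (to avoid log(0)) are our own choices:

```python
import numpy as np

def binary_cross_entropy(y_true, p_pred, eps=1e-7):
    # Clip probabilities away from 0 and 1 so log() stays finite
    p = np.clip(p_pred, eps, 1 - eps)
    # Average the per-example losses over the batch
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

y_true = np.array([1., 0., 1.])
p_pred = np.array([0.9, 0.2, 0.7])
print(binary_cross_entropy(y_true, p_pred))
```

Note that the loss is small when confident predictions match the labels and grows without bound as a confident prediction turns out wrong.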
import torch

# Targets and raw scores (logits) for a batch of 10 examples with 64 labels each
target = torch.ones([10, 64], dtype=torch.float32)
output = torch.full([10, 64], 1.5)  # logits, not probabilities

# Optional per-label weight applied to the positive examples
pos_weight = torch.ones([64])
criterion = torch.nn.BCEWithLogitsLoss(pos_weight=pos_weight)
criterion(output, target)
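BCEWithLogitsLoss fuses the Sigmoid and the BCE into one numerically stable step. As a quick sanity check (a sketch with the same shapes and values as the example above), it should agree with applying torch.sigmoid manually and then BCELoss:

```python
import torch

logits = torch.full([10, 64], 1.5)
target = torch.ones([10, 64])

# Fused: sigmoid + BCE in one call
with_logits = torch.nn.BCEWithLogitsLoss()(logits, target)
# Manual: sigmoid first, then plain BCE on probabilities
manual = torch.nn.BCELoss()(torch.sigmoid(logits), target)

print(with_logits.item(), manual.item())
```

Prefer the fused version in practice; the separate sigmoid-then-BCE path can overflow for large-magnitude logits.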
import tensorflow as tf

# Two examples, each with two independent binary labels
y_true = [[0., 1.], [0., 0.]]
y_pred = [[0.6, 0.4], [0.4, 0.6]]  # predicted probabilities
bce = tf.keras.losses.BinaryCrossentropy()
bce(y_true, y_pred).numpy()

# Per-example weights: a weight of 0 drops the second example entirely
bce(y_true, y_pred, sample_weight=[1, 0]).numpy()
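Keras offers the same fused-logits option as PyTorch via the from_logits flag. A small sketch (the logit values here are illustrative, not from the example above):

```python
import tensorflow as tf

y_true = [[0., 1.], [0., 0.]]
logits = [[0.4, -0.6], [-0.4, 0.6]]  # raw scores, not probabilities

# from_logits=True applies the sigmoid internally, more stably
bce_logits = tf.keras.losses.BinaryCrossentropy(from_logits=True)
print(bce_logits(y_true, logits).numpy())
```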