IoU (Intersection over Union)

IoU is used to measure how well a predicted mask or bounding box matches the ground truth annotation.

IoU is also known as the Jaccard index or Jaccard similarity coefficient.

Interpretation / calculation

The IoU is calculated by dividing the area of overlap between the prediction and the ground truth label by the area of their union.

The output is a value between 0 and 1 (often reported as a percentage) indicating how much the two labels overlap: 0 means no overlap and 1 means a perfect match.
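
For intuition, here is a minimal sketch of the same ratio computed for two axis-aligned bounding boxes; the (x1, y1, x2, y2) corner format and the box_iou helper name are assumptions made for this illustration, not part of the implementations below.

def box_iou(box_a, box_b):
    # Boxes are assumed to be (x1, y1, x2, y2) with x1 < x2 and y1 < y2
    inter_w = max(0.0, min(box_a[2], box_b[2]) - max(box_a[0], box_b[0]))
    inter_h = max(0.0, min(box_a[3], box_b[3]) - max(box_a[1], box_b[1]))
    intersection = inter_w * inter_h

    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - intersection

    return intersection / union if union > 0 else 0.0

print(box_iou((0, 0, 2, 2), (1, 0, 3, 2)))  # overlap 2, union 6 => 0.333...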

Code implementation

NumPy

import numpy as np

SMOOTH = 1e-6

def iou_numpy(outputs: np.ndarray, labels: np.ndarray):
    # If the prediction comes as BATCH x 1 x H x W, drop the channel dimension
    outputs = outputs.squeeze(1)  # BATCH x 1 x H x W => BATCH x H x W

    intersection = (outputs & labels).sum((1, 2))  # Will be zero if Truth=0 or Prediction=0
    union = (outputs | labels).sum((1, 2))         # Will be zero if both are 0

    iou = (intersection + SMOOTH) / (union + SMOOTH)  # Smooth the division to avoid 0/0

    thresholded = np.ceil(np.clip(20 * (iou - 0.5), 0, 10)) / 10  # This is equal to comparing with thresholds

    return thresholded  # Or thresholded.mean() for the average across the batch
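
A quick sanity check with made-up boolean masks (shape BATCH x 1 x H x W for the prediction and BATCH x H x W for the ground truth) might look like this:

pred = np.zeros((1, 1, 4, 4), dtype=bool)
pred[0, 0] = True                        # prediction covers the whole 4x4 image
truth = np.zeros((1, 4, 4), dtype=bool)
truth[0, :3, :3] = True                  # ground truth covers a 3x3 block

print(iou_numpy(pred, truth))  # IoU = 9/16 ≈ 0.56, passes the 0.5 and 0.55 thresholds => [0.2]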

PyTorch

import torch

SMOOTH = 1e-6

def iou_pytorch(outputs: torch.Tensor, labels: torch.Tensor):
    # You can comment out this line if you are passing tensors of equal shape,
    # but if you are passing output from a UNet or something similar it will most
    # probably be in the BATCH x 1 x H x W shape
    outputs = outputs.squeeze(1)  # BATCH x 1 x H x W => BATCH x H x W

    intersection = (outputs & labels).float().sum((1, 2))  # Will be zero if Truth=0 or Prediction=0
    union = (outputs | labels).float().sum((1, 2))         # Will be zero if both are 0

    iou = (intersection + SMOOTH) / (union + SMOOTH)  # We smooth our division to avoid 0/0

    thresholded = torch.clamp(20 * (iou - 0.5), 0, 10).ceil() / 10  # This is equal to comparing with thresholds

    return thresholded  # Or thresholded.mean() if you are interested in average across the batch
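
The same made-up masks can be used to sanity-check the PyTorch version, this time as boolean tensors:

pred = torch.zeros((1, 1, 4, 4), dtype=torch.bool)
pred[0, 0] = True                        # prediction covers the whole 4x4 image
truth = torch.zeros((1, 4, 4), dtype=torch.bool)
truth[0, :3, :3] = True                  # ground truth covers a 3x3 block

print(iou_pytorch(pred, truth))  # tensor([0.2000]), same result as the NumPy version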
