Margin is another essential metric used in Active Learning. It takes the predicted class probabilities for each image and computes the difference between the two highest-probability classes. The image with the lowest margin is suggested for labeling.

Consider an example where we have two instances with the following class probabilities:

  • Instance A: “cat” – 0.5, “milkshake” – 0.45, “cloud” – 0.05.
  • Instance B: “cat” – 0.4, “milkshake” – 0.3, “cloud” – 0.3.

Here, in contrast to the Variance metric, the model will choose instance A over instance B:

  • Margin of A = 0.5 – 0.45 = 0.05
  • Margin of B = 0.4 – 0.3 = 0.1

Even though the class probabilities of “cat” and “milkshake” are higher in A (0.5 and 0.45) than in B (0.4 and 0.3), the model is more likely to confuse them, since the margin in A is very low.
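As a minimal sketch of how this selection could be implemented (the function name `margin_scores` and the use of NumPy are assumptions for illustration, not part of any particular library), the snippet below computes the margin for each instance and suggests the one with the smallest margin for labeling:

```python
import numpy as np

def margin_scores(probs: np.ndarray) -> np.ndarray:
    """Margin = difference between the two highest class probabilities per instance."""
    # Sort each row in descending order, then subtract the second-highest from the highest.
    sorted_probs = np.sort(probs, axis=1)[:, ::-1]
    return sorted_probs[:, 0] - sorted_probs[:, 1]

# Class order: ["cat", "milkshake", "cloud"]
probs = np.array([
    [0.5, 0.45, 0.05],  # Instance A
    [0.4, 0.30, 0.30],  # Instance B
])

margins = margin_scores(probs)       # [0.05, 0.1]
query_idx = int(np.argmin(margins))  # 0 -> Instance A is suggested for labeling
```

The instance with the smallest margin is the one the model is closest to confusing, which is why it is the most informative candidate to send for labeling.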


