If you have ever worked on a Computer Vision project, you probably know that using a learning rate scheduler can significantly improve your model's training results. On this page, we will:

  • Cover the Exponential Learning Rate (ExponentialLR) scheduler;
  • Check out its parameters;
  • See the potential effect of ExponentialLR on a learning curve;
  • And learn how to work with ExponentialLR using Python and the PyTorch framework.

Let’s jump in.

The Exponential Learning Rate scheduling technique multiplies the learning rate by a constant factor called gamma at the end of every epoch (or every evaluation period when training by iterations). With gamma below 1, the learning rate drops sharply over the first several epochs and the decay slows down afterwards, so most of the training runs with low values. The learning rate approaches zero asymptotically but never reaches it.

ExponentialLR formula: lr_epoch = initial_lr × gamma^epoch
ExponentialLR example: plotted against epochs, the learning rate falls steeply at first and then flattens out as it approaches zero.
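
To make the shape of this decay concrete, here is a minimal sketch in plain Python; the initial learning rate of 0.1 and gamma of 0.5 are assumed values chosen only for illustration:

# Exponential decay: lr_epoch = initial_lr * gamma ** epoch
initial_lr = 0.1  # assumed starting learning rate
gamma = 0.5       # assumed decay factor (must be < 1 to decay)

for epoch in range(5):
    lr = initial_lr * gamma ** epoch
    print(f"epoch {epoch}: lr = {lr:.5f}")

# Output:
# epoch 0: lr = 0.10000
# epoch 1: lr = 0.05000
# epoch 2: lr = 0.02500
# epoch 3: lr = 0.01250
# epoch 4: lr = 0.00625

The absolute drop shrinks every epoch - from 0.05 in the first step to roughly 0.006 in the fourth - which is why most of the training ends up running with small learning rates.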

ExponentialLR has two key parameters:

  • Gamma - a multiplicative factor by which the learning rate is decayed every epoch. For instance, if the learning rate is 1000 and gamma is 0.5, the new learning rate will be 1000 x 0.5 = 500. The gamma value should be less than 1 to reduce the learning rate (see the quick sanity check after this list);
  • Last epoch - the index of the last epoch, -1 by default. When left at -1, the scheduler starts the schedule from the initial Base Learning Rate.
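
As a quick sanity check of the gamma parameter, the sketch below steps the scheduler a few times and prints the decayed values with get_last_lr(); the SGD optimizer, the single dummy parameter, and the initial learning rate of 0.01 are arbitrary choices made just for the demonstration:

import torch

# A single dummy parameter so the optimizer has something to manage
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.01)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.5)

for epoch in range(4):
    print(epoch, scheduler.get_last_lr())  # [0.01], [0.005], [0.0025], [0.00125]
    optimizer.step()
    scheduler.step()  # multiplies the learning rate by gamma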
ExponentialLR in comparison to other schedulers.
Here is how ExponentialLR fits into a standard PyTorch training loop; the model, dataset, and loss function below are toy placeholders added only to keep the snippet self-contained and runnable:

import torch
import torch.nn as nn

# Toy model, data, and loss so the example runs end to end
model = nn.Linear(2, 2)
dataset = [(torch.randn(4, 2), torch.randn(4, 2)) for _ in range(10)]
loss_fn = nn.MSELoss()

optimizer = torch.optim.AdamW(model.parameters(), lr=0.01, weight_decay=0.01, amsgrad=False)
# Multiply the learning rate by gamma at every scheduler.step() call
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.1, last_epoch=-1)

for epoch in range(20):
    for input, target in dataset:
        optimizer.zero_grad()
        output = model(input)
        loss = loss_fn(output, target)
        loss.backward()
        optimizer.step()
    scheduler.step()  # decay the learning rate once per epoch
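Note that scheduler.step() is called outside the inner batch loop, so the learning rate is decayed once per epoch. Moving it inside the loop would multiply the rate by gamma on every iteration, and with a factor as aggressive as 0.1 the learning rate would collapse toward zero after only a handful of batches.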
