# MultiStepLR

MultiStepLR is a learning rate scheduler that decays the learning rate by a user-defined multiplicative factor (gamma) each time training reaches one of the user-defined milestones. A milestone is simply an epoch index at which the decay is applied.

### Milestones

Milestones are a list of increasing epoch indices; each time training passes one of them, the learning rate is multiplied by gamma.

### Gamma

Gamma is the multiplicative factor by which the learning rate is scaled at each milestone (0.1 by default in PyTorch).
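
Putting the two together, the learning rate at any epoch is the base learning rate multiplied by gamma raised to the number of milestones already passed:

€€lr_{epoch} = lr_{base} \times \gamma^{k}€€

where €€k€€ is the count of milestones less than or equal to the current epoch.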

### Mathematical Demonstration

Let us demonstrate the functioning of the MultiStepLR with a simple calculation.

If the milestones are set to 30 and 80, the base learning rate is 0.05, and gamma is 0.1, then:

for €€0 \le epoch < 30€€, €€lr = 0.05€€

for €€30 \le epoch < 80€€, €€lr = 0.05 \times 0.1 = 0.005€€

for €€epoch \ge 80€€, €€lr = 0.05 \times 0.1^{2} = 0.0005€€
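
The same numbers can be reproduced directly with PyTorch. Below is a minimal sketch; the single dummy parameter is an assumption made only so the optimizer has something to manage:

```python
import torch

# A dummy parameter stands in for a real model; it exists only so the
# optimizer has something to hold.
param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.SGD([param], lr=0.05)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)

for epoch in range(100):
    if epoch in (0, 30, 80):
        print(epoch, scheduler.get_last_lr())  # 0.05, then 0.005, then 0.0005 (up to rounding)
    optimizer.step()        # a real training loop would update the model here
    scheduler.step()        # advance the schedule once per epoch
```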

### Code Implementation


A minimal usage sketch; `model`, `dataset`, and the `train_step` helper are assumed to be defined elsewhere:

```python
import torch

optimizer = torch.optim.SGD(model.parameters(), lr=0.05)   # model assumed to exist
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)

for epoch in range(100):
    for input, target in dataset:
        train_step(input, target)   # forward pass, backward pass, optimizer.step()
    scheduler.step()                # decay the learning rate at epochs 30 and 80
```
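
Note that `scheduler.step()` is called once per epoch, after the optimizer has updated the parameters for that epoch; stepping the scheduler per batch instead would pass the milestones far too early, and recent PyTorch versions warn if `scheduler.step()` is called before `optimizer.step()`.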