
Learning Rate Scheduler

jiheek 2022. 6. 21. 14:00
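
All of the schedulers below share the same step-based interface: the constructor takes the total number of training steps, and __call__(step) returns the learning rate to use at that step.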

 

1. Constant

class ConstantScheduler:
    def __init__(self, steps, lr):
        self.steps = steps  # kept for a uniform interface; unused here
        self.lr = lr

    def __call__(self, step):
        # The learning rate stays fixed at lr for every step.
        return self.lr

 

2. Multistep
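
A minimal sketch in the same __call__(step) style as the other classes, assuming milestone steps at which the learning rate is decayed by a factor gamma (the class name, milestones, and the gamma=0.1 default are illustrative, modeled on torch.optim.lr_scheduler.MultiStepLR):

class MultistepScheduler:
    def __init__(self, steps, base_lr, milestones, gamma=0.1):
        self.steps = steps
        self.base_lr = base_lr
        self.milestones = sorted(milestones)  # steps at which the LR drops
        self.gamma = gamma                    # multiplicative decay factor

    def __call__(self, step):
        # Multiply base_lr by gamma once for every milestone already passed.
        return self.base_lr * self.gamma ** sum(step >= m for m in self.milestones)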

 

 

3. Linear

class LinearScheduler:
    def __init__(self, steps, lr_start, lr_end):
        self.steps = steps
        self.lr_start = lr_start
        self.lr_end = lr_end

    def __call__(self, step):
        # Linear interpolation from lr_start at step 0 to lr_end at step == steps.
        return self.lr_start + (self.lr_end -
                                self.lr_start) * step / self.steps

 

4. Cosine

import numpy as np


class CosineScheduler:
    def __init__(self, steps, base_lr, lr_min_factor=1e-3):
        self.steps = steps
        self.base_lr = base_lr
        self.lr_min_factor = lr_min_factor

    def __call__(self, step):
        # Half-cosine decay from base_lr at step 0 down to
        # base_lr * lr_min_factor at step == steps.
        return self.base_lr * (self.lr_min_factor +
                               (1 - self.lr_min_factor) * 0.5 *
                               (1 + np.cos(step / self.steps * np.pi)))

 

5. SGDR
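
A minimal sketch of SGDR (cosine annealing with warm restarts, Loshchilov & Hutter 2017) in the same style, assuming a first cycle of cycle_steps steps whose length is multiplied by cycle_mult after each restart (both names and defaults are illustrative):

import numpy as np


class SGDRScheduler:
    def __init__(self, steps, base_lr, lr_min_factor=1e-3,
                 cycle_steps=10000, cycle_mult=2):
        self.steps = steps
        self.base_lr = base_lr
        self.lr_min_factor = lr_min_factor
        self.cycle_steps = cycle_steps  # length of the first cycle
        self.cycle_mult = cycle_mult    # growth factor after each restart

    def __call__(self, step):
        # Walk past completed cycles to find the offset within the current one.
        cycle_len = self.cycle_steps
        while step >= cycle_len:
            step -= cycle_len
            cycle_len *= self.cycle_mult
        # Cosine decay that restarts from base_lr at the start of every cycle.
        return self.base_lr * (self.lr_min_factor +
                               (1 - self.lr_min_factor) * 0.5 *
                               (1 + np.cos(step / cycle_len * np.pi)))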

 

 

6. Warmup
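
A minimal warmup sketch: ramp the learning rate linearly over the first warmup_steps, then hand off to any of the schedulers above (the wrapper name and its parameters are assumptions for illustration):

class WarmupWrapper:
    def __init__(self, scheduler, warmup_steps, warmup_start_factor=1e-3):
        self.scheduler = scheduler          # any __call__(step) scheduler above
        self.warmup_steps = warmup_steps
        self.warmup_start_factor = warmup_start_factor

    def __call__(self, step):
        if step < self.warmup_steps:
            # Ramp linearly toward the LR the wrapped schedule
            # prescribes at the handoff point.
            target = self.scheduler(self.warmup_steps)
            factor = (self.warmup_start_factor +
                      (1 - self.warmup_start_factor) * step / self.warmup_steps)
            return target * factor
        return self.scheduler(step)

Since every class here simply maps a step index to a learning rate, one way to wire them into a PyTorch training loop is to set each param group's lr by hand (the toy model and numbers below are illustrative):

import torch

model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

total_steps = 10000
scheduler = WarmupWrapper(CosineScheduler(total_steps, base_lr=0.1),
                          warmup_steps=500)

for step in range(total_steps):
    lr = scheduler(step)                 # LR for this step
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
    # ... forward / backward / optimizer.step() ...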

 

 

 


Source

Code: https://github.com/hysts/pytorch_image_classification

https://www.kaggle.com/code/isbhargav/guide-to-pytorch-learning-rate-scheduling/notebook

 
