
pytorch-gradual-warmup-lr

Gradually warm up (increase) the learning rate for a PyTorch optimizer. Proposed in 'Accurate, Large Minibatch SGD: Training ImageNet in 1 Hour'.

Install

$ pip install git+https://github.com/ildoonet/pytorch-gradual-warmup-lr.git

Usage

Example: gradual warmup for 5 epochs, then cosine annealing. See the run.py file.
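The schedule the example describes can be sketched in plain Python. This is a dependency-free illustration of the math (linear warmup followed by cosine annealing), not the library's actual API; the function name and default values are hypothetical.

```python
import math

def warmup_cosine_lr(epoch, base_lr=0.1, warmup_epochs=5, total_epochs=90):
    """Illustrative schedule: linear warmup for `warmup_epochs`,
    then cosine annealing down to zero over the remaining epochs.
    (Hypothetical helper; the library wraps this logic around a
    PyTorch scheduler instead.)"""
    if epoch < warmup_epochs:
        # Ramp linearly from base_lr / warmup_epochs up to base_lr.
        return base_lr * (epoch + 1) / warmup_epochs
    # After warmup, anneal with a half cosine from base_lr to 0.
    progress = (epoch - warmup_epochs) / (total_epochs - warmup_epochs)
    return 0.5 * base_lr * (1 + math.cos(math.pi * progress))

# Learning rate for each of 90 epochs.
lrs = [warmup_cosine_lr(e) for e in range(90)]
```

The warmup phase avoids the instability of starting a large-minibatch run at the full learning rate, which is the motivation given in the referenced paper.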