Implements the AdamW optimizer (https://arxiv.org/abs/1711.05101), a cosine learning rate scheduler, and "Cyclical Learning Rates for Training Neural Networks" (https://arxiv.org/abs/1506.01186) for the PyTorch framework
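The triangular schedule from the cyclical learning rates paper can be sketched in a few lines; the function name below is illustrative and not taken from any listed repository:

```python
import math

def triangular_clr(iteration, base_lr, max_lr, step_size):
    """Triangular cyclical learning rate (Smith, arXiv:1506.01186).

    The learning rate rises linearly from base_lr to max_lr over
    step_size iterations, then falls back, repeating every
    2 * step_size iterations.
    """
    cycle = math.floor(1 + iteration / (2 * step_size))
    x = abs(iteration / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1 - x)
```

PyTorch itself ships this policy as `torch.optim.lr_scheduler.CyclicLR`, which also supports the `triangular2` and `exp_range` variants from the paper.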
Updated Jul 14, 2019 - Python
Super-convergence implementation on TensorFlow Slim
SGLD and cSGLD as a PyTorch Optimizer
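SGLD (stochastic gradient Langevin dynamics) adds Gaussian noise to each gradient step so that, for a small step size, the iterates sample from the posterior rather than converging to a single point. A minimal scalar sketch, illustrative rather than the listed repository's implementation:

```python
import math
import random

def sgld_step(theta, grad, lr, rng):
    """One SGLD update: half a gradient step on U(theta) plus
    N(0, lr) noise, so the chain targets exp(-U(theta)) as lr -> 0."""
    noise = rng.gauss(0.0, math.sqrt(lr))
    return theta - 0.5 * lr * grad + noise

# Toy target: U(theta) = theta**2 / 2, i.e. a standard normal,
# whose gradient is simply theta.
rng = random.Random(0)
theta = 5.0
samples = []
for t in range(20000):
    theta = sgld_step(theta, theta, 0.1, rng)
    if t >= 1000:  # discard burn-in before collecting samples
        samples.append(theta)
```

The cyclical variant (cSGLD) drives `lr` with a cyclical schedule so that large-step phases explore between posterior modes and small-step phases sample accurately around each one.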
Re-training the DenseNet deep neural network using various learning rate strategies. Entry for the Food Recognition Challenge in the Master's course Applied Machine Learning.
Experiments on the paper Super-Convergence