Diponly/Computer_Vision_Models

Learning rate annealing recommends starting with a relatively high learning rate and then gradually lowering it during training. The intuition behind this approach is that we'd like to traverse quickly from the initial parameters to a range of "good" parameter values, but then we'd like a learning rate small enough that we can explore the "deeper, but narrower parts of the loss function".

Ref: https://cs231n.github.io/neural-networks-3/#annealing-the-learning-rate

Ref: https://www.jeremyjordan.me/nn-learning-rate/
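The annealing idea above can be sketched as a simple step-decay schedule. The initial rate, decay factor, and decay interval below are illustrative assumptions, not values taken from this repository:

```python
# A minimal sketch of learning-rate annealing via step decay.
# All hyperparameter values here are hypothetical examples.

def annealed_lr(initial_lr: float, decay_factor: float,
                decay_every: int, epoch: int) -> float:
    """Return the learning rate for a given epoch under step decay:
    the rate is multiplied by `decay_factor` every `decay_every` epochs."""
    return initial_lr * (decay_factor ** (epoch // decay_every))

# Start high to move quickly toward a good region of parameter space,
# then shrink the rate to settle into the deeper, narrower parts of
# the loss surface.
for epoch in (0, 10, 20):
    print(epoch, annealed_lr(0.1, 0.5, 10, epoch))
```

Frameworks provide equivalent built-ins (e.g. PyTorch's `torch.optim.lr_scheduler.StepLR`), but the arithmetic is the same as this sketch.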


About

Backbone models for U-Net: ResNet50, ResNet34, VGG.
